Check out this SlideShare to understand the challenges of BCBS 239 and learn ways to collect, measure, monitor, and report on data to achieve better data integrity and data quality. Both G-SIBs and D-SIBs will learn how to better govern their data.
Alignment: Office of the Chief Data Officer & BCBS 239 (Craig Milroy)
Alignment: Office of the Chief Data Officer & BCBS 239. An alignment overview between the OCDO framework and the Principles for Effective Risk Data Aggregation and Risk Reporting.
1) The document discusses BCBS 239, a regulation from the Basel Committee on Banking Supervision that demands accurate and timely risk data reporting. It focuses on developing risk management capabilities rather than just compliance.
2) EY has developed a Risk Data Aggregation and Reporting (RDAR) Framework to help banks comply with BCBS 239 in a practical way by prioritizing capabilities and coordinating change efforts. The framework addresses key areas like data, processes, people and technology.
3) Banks face many challenges in coordinating regulatory changes from different rules with overlapping requirements. The RDAR Framework helps banks integrate priorities using common capability objectives to connect changes in a meaningful way.
This document discusses implementing the Basel Committee on Banking Supervision's Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS-239). It begins with some questions and challenges around BCBS-239 implementation related to the scope of risk data aggregation, level of automation, compliance focus, regulator engagement, implementation guidance, target state definition, and progress measurement. The document then presents six themes for interpreting the BCBS-239 principles: materiality, flexibility, reconciliation and validation, transparency, automation and adaptation, and speed and confidentiality. It outlines a framework moving from siloed, manual arrangements to a totally integrated, automated environment. Benchmarks are suggested to measure compliance levels against this target state.
Enterprise architecture is a discipline that helps define, develop, and exploit boundaryless information flow capabilities to achieve an organization's strategic goals. It translates business vision and strategy into effective enterprise change by developing principles and models that describe the future state and enable evolution. Common enterprise architecture frameworks include TOGAF, Zachman Framework, FEAF, and DODAF, which provide standardized approaches and classifications.
DAS Slides: Data Governance - Combining Data Management with Organizational ... (DATAVERSITY)
Data Governance is both a technical and an organizational discipline, and getting Data Governance right requires a combination of Data Management fundamentals aligned with organizational change and stakeholder buy-in. Join Nigel Turner and Donna Burbank as they provide an architecture-based approach to aligning business motivation, organizational change, Metadata Management, Data Architecture and more in a concrete, practical way to achieve success in your organization.
Activate Data Governance Using the Data Catalog (DATAVERSITY)
This document discusses activating data governance using a data catalog. It compares active vs passive data governance, with active embedding governance into people's work through a catalog. The catalog plays a key role by allowing stewards to document definition, production, and usage of data in a centralized place. For governance to be effective, metadata from various sources must be consolidated and maintained in the catalog.
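To make the catalog's consolidating role concrete, here is a minimal sketch of how metadata harvested from several sources might be merged into one centralized entry. The CatalogEntry structure and its field names are illustrative assumptions, not any catalog product's actual schema.

```python
# Minimal sketch of consolidating harvested metadata into a central
# catalog; the structure and field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str                    # business name of the data element
    definition: str              # steward-approved business definition
    steward: str                 # person accountable for the element
    produced_by: str             # system or process that creates the data
    used_by: list[str] = field(default_factory=list)  # known consumers

def consolidate(catalog: dict[str, CatalogEntry], harvested: CatalogEntry) -> None:
    """Merge metadata harvested from one source into the central catalog."""
    existing = catalog.get(harvested.name)
    if existing is None:
        catalog[harvested.name] = harvested   # first sighting: register it
    else:
        # Union the usage information so stewards see every consumer.
        existing.used_by = sorted(set(existing.used_by) | set(harvested.used_by))

catalog: dict[str, CatalogEntry] = {}
consolidate(catalog, CatalogEntry("customer_id", "Unique customer key",
                                  "J. Steward", "CRM", ["Billing"]))
consolidate(catalog, CatalogEntry("customer_id", "Unique customer key",
                                  "J. Steward", "CRM", ["Risk Reporting"]))
print(catalog["customer_id"].used_by)  # ['Billing', 'Risk Reporting']
```

The point is only that definition, production, and usage live in one place; a real catalog layers lineage, versioning, and workflow on top.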
Data Architecture, Solution Architecture, Platform Architecture — What’s the ... (DATAVERSITY)
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
What is a secure enterprise architecture roadmap? (Ulf Mattsson)
Webcast title: What is a Secure Enterprise Architecture Roadmap?
Description: This session will cover the following topics:
* What is a Secure Enterprise Architecture (SEA) roadmap?
* Are there different roadmaps for different industries?
* How does compliance fit in with a SEA?
* Do blockchain, GDPR, Cloud, and IoT conflict with compliance regulations, complicating your SEA?
* How will quantum computing impact the SEA roadmap?
Presenters: Juanita Koilpillai, Bob Flores, Mark Rasch, Ulf Mattsson, David Morris
Duration: 68 min
Date & Time: Sep 20 2018 8:00 am
Timezone: United States - New York
Webcast URL: https://www.brighttalk.com/webinar/what-is-a-secure-enterprise-architecture-roadmap
Presentation: Enterprise Architecture design In 3 Minutes or so (Adrian Grigoriu)
The document provides an overview of enterprise architecture for the airline Qantas. It discusses the enterprise, including its structure and operations. It also discusses enterprise architecture, which provides a blueprint describing stakeholders' value streams and how technology and organizational resources execute them. Enterprise architecture enables enterprise-wide strategic roadmapping and transformation through project portfolio management. It helps streamline operations, increase agility, and provide competitive advantages.
You Need a Data Catalog. Do You Know Why? (Precisely)
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata describing data sets, how they are defined, and where to find them. TDWI research indicates that implementing a data catalog is a top priority among organizations we survey. The data catalog can also play an important part in the governance process: it provides features that help ensure data quality and compliance, and that trusted data is used for analysis. Without in-depth knowledge of data and associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
Data Integration, Access, Flow, Exchange, Transfer, Load And Extract Architec... (Alan McSweeney)
These notes describe a generalised data integration architecture framework and set of capabilities.
In many organisations, data integration has evolved over time, with many solution-specific tactical approaches implemented. The consequence is that there is frequently a mixed, inconsistent data integration topography. Data integrations are often poorly understood, undocumented, and difficult to support, maintain, and enhance.
Data interoperability and solution interoperability are closely related – you cannot have effective solution interoperability without data interoperability.
Data integration has multiple meanings and can be used in multiple ways, such as:
- Integration in terms of handling data transfers, exchanges, and requests for information using a variety of information movement technologies
- Integration in terms of migrating data from a source to a target system and/or loading data into a target system
- Integration in terms of aggregating data from multiple sources into one source, possibly with date and time dimensions added to the integrated data, for reporting and analytics
- Integration in terms of synchronising two data sources, or regularly extracting data from one data source to update a target
- Integration in terms of service orientation and API management to provide access to raw data or the results of processing
There are two aspects to data integration:
1. Operational Integration – allow data to move from one operational system and its data store to another
2. Analytic Integration – move data from operational systems and their data stores into a common structure for analysis
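As a rough sketch of these two aspects, the example below contrasts an operational move of a single record between stores with an analytic load that stamps each row with a date dimension. The record layout and store names are assumptions made for the illustration.

```python
# Illustrative sketch of the two aspects of data integration; all record
# and store names are invented for the example.
from datetime import date

def operational_integration(record: dict, target_store: list) -> None:
    """Operational: move one record from one system's store to another, as-is."""
    target_store.append(record)

def analytic_integration(records: list, warehouse: list) -> None:
    """Analytic: load operational records into a common structure for
    analysis, adding a date dimension to each row."""
    for record in records:
        warehouse.append({**record, "load_date": date.today().isoformat()})

crm_orders = [{"order_id": 1, "amount": 250.0}]
billing_store: list = []
warehouse: list = []

operational_integration(crm_orders[0], billing_store)  # record moves unchanged
analytic_integration(crm_orders, warehouse)            # row gains a time dimension
print(billing_store)
print(warehouse)
```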
The Role of Data Governance in a Data Strategy (DATAVERSITY)
A Data Strategy is a plan for moving an organization towards a more data-driven culture. A Data Strategy is often viewed as a technical exercise. A modern and comprehensive Data Strategy addresses more than just the data; it is a roadmap that defines people, process, and technology. The people aspect includes governance, the execution and enforcement of authority, and formalization of accountability over the management of the data.
In this RWDG webinar, Bob Seiner will share where Data Governance fits into an effective Data Strategy. As part of the strategy, the program must focus on the governance of people, process, and technology fixated on treating and leveraging data as a valued asset. Join us to learn about the role of Data Governance in a Data Strategy.
Bob will address the following in this webinar:
- A structure for delivery of a Data Strategy
- How to address people, process, and technology in a Data Strategy
- Why Data Governance is an important piece of a Data Strategy
- How to include Data Governance in the structure of the policy
- Examples of how governance has been included in a Data Strategy
Business Architecture the Key to Enterprise Transformation (Mike Walker)
The document discusses business architecture and how it is transforming enterprise architecture. It provides an overview of business architecture, including definitions and frameworks. It outlines how business architecture delivers business value by connecting strategy to execution. It emphasizes the importance of understanding business needs, value streams, and delivering capabilities to address the "why" rather than just producing artifacts. The document shares proven practices from HP's experience delivering successful business architecture programs to customers.
Gartner: Master Data Management Functionality (Gartner)
MDM solutions require tightly integrated capabilities including data modeling, integration, synchronization, propagation, flexible architecture, granular and packaged services, performance, availability, analysis, information quality management, and security. These capabilities allow organizations to extend data models, integrate and synchronize data in real-time and batch processes across systems, measure ROI and data quality, and securely manage the MDM solution.
Essential Reference and Master Data Management (DATAVERSITY)
Data tends to pile up and can be rendered unusable or obsolete without careful maintenance processes. Reference and Master Data Management (MDM) has been a popular Data Management approach to effectively gain mastery over not just the data but the supporting architecture for processing it. This webinar presents MDM as a strategic approach to improving and formalizing practices around those data items that provide context for many organizational transactions: its master data. Too often, MDM has been implemented technology-first and achieved the same very poor track record (one-third succeeding on-time, within budget, and achieving planned functionality). MDM success depends on a coordinated approach typically involving Data Governance and Data Quality activities.
Learning objectives:
- Understand foundational reference and MDM concepts based on the Data Management Body of Knowledge (DMBOK)
- Understand why these are an important component of your Data Architecture
- Gain awareness of Reference and MDM Frameworks and building blocks
- Know what MDM guiding principles consist of and best practices
- Know how to utilize reference and MDM in support of business strategy
Data Architecture - The Foundation for Enterprise Architecture and Governance (DATAVERSITY)
Organizations are faced with an increasingly complex data landscape, finding themselves unable to cope with exponentially increasing data volumes, compounded by additional regulatory requirements with increased fines for non-compliance. Enterprise architecture and data governance are often discussed at length, but frequently with different stakeholder audiences. This can result in complementary and sometimes conflicting initiatives rather than a focused, integrated approach. Data governance requires a solid data architecture foundation in order to support the pillars of enterprise architecture. In this session, IDERA’s Ron Huizenga will discuss a practical, integrated approach to effectively understand, define, and implement a cohesive enterprise architecture and data governance discipline with integrated modeling and metadata management.
Customer-Centric Data Management for Better Customer Experiences (Informatica)
With consumer and business buyer expectations growing exponentially, more businesses are competing on the basis of customer experience. But executing preferred customer experiences requires data about who your customers are today and what they will likely need in the future. Every business can benefit from an AI-powered master data management platform to supply this information to line-of-business owners so they can execute great experiences at scale. The same need holds from an internal business process perspective as well. For example, many businesses require better data management practices to deliver preferred employee experiences. Informatica provides an MDM platform to solve for these examples and more.
Data Architecture Best Practices for Advanced Analytics (DATAVERSITY)
Many organizations are immature when it comes to data and analytics use. The answer lies in delivering a greater level of insight from data, straight to the point of need.
Many Data Architecture best practices have accumulated from years of practice. In this webinar, William will look at some that he believes have emerged in the past two years and are not yet worked into many enterprise data programs. These are keepers that organizations will need to move towards by one means or another, so it is best to work them into the environment mindfully.
Enterprise Architecture Implementation And The Open Group Architecture Framew... (Alan McSweeney)
The document discusses enterprise architecture and TOGAF. It defines enterprise architecture as a framework for addressing the increasing complexity of IT systems and poor alignment between business and IT needs. TOGAF provides a framework for developing enterprise architecture, with the goal of improving business-IT alignment and allowing organizations to better respond to changing business needs. The document outlines challenges in developing enterprise architecture and stresses the importance of balancing strategic planning with technology solutions.
Data Governance and Data Science to Improve Data Quality (DATAVERSITY)
Data Science uses systematic methods, algorithms, and systems to extract knowledge and insights from structured and unstructured data. Data Science requires high-quality data that is trusted by the organization and data scientists. Many organizations focus their Data Governance programs on improving Data Quality results. These three concepts (governance, science, and quality) seem to be made for each other.
In this RWDG webinar, Bob Seiner and his special guest will discuss how the people focusing on Data Governance and Data Science must work together to improve the level of confidence the organization has in its most critical data assets. Heavy investments are being made in Data Science but not so much for Data Governance. Bob will talk about how Data Governance and Data Science must work together to improve Data Quality.
Slides from a presentation given by Paul Turner to meetings of IIBA UK on 16 July and 12 August 2014.
Much has been written about technical and solution architectures, without due attention being given to how these work together with the Business Architecture.
It is easy to believe that those who are involved in business analysis, requirements definition and systems modelling do not need to consider the Business Architecture at all. This could not be further from the truth. This talk explains the rationale behind Business Architecture, what its main components are and why Business Analysts should ensure that they understand it and the influence it is likely to have on their work.
Data Governance Trends and Best Practices To Implement Today (DATAVERSITY)
1) The document discusses best practices for data protection on Google Cloud, including setting data policies, governing access, classifying sensitive data, controlling access, encryption, secure collaboration, and incident response.
2) It provides examples of how to limit access to data and sensitive information, gain visibility into where sensitive data resides, encrypt data with customer-controlled keys, harden workloads, run workloads confidentially, collaborate securely with untrusted parties, and address cloud security incidents.
3) The key recommendations are to protect data at rest and in use through classification, access controls, encryption, confidential computing; securely share data through techniques like secure multi-party computation; and have an incident response plan to quickly address threats.
Business Value Through Reference and Master Data Strategies (DATAVERSITY)
Data tends to pile up and can be rendered unusable or obsolete without careful maintenance processes. Reference and Master Data Management (MDM) has been a popular Data Management approach to effectively gain mastery over not just the data but the supporting architecture for processing it. This webinar presents MDM as a strategic approach to improving and formalizing practices around those data items that provide context for many organizational transactions — the master data. Too often, MDM has been implemented technology-first and achieved the same very poor track record (one-third succeeding on time, within budget, and achieving planned functionality). MDM success depends on a coordinated approach, typically involving Data Governance and Data Quality activities.
Learning Objectives:
• Understand foundational reference and MDM concepts based on the Data Management Body of Knowledge (DMBoK)
• Understand why these are an important component of your Data Architecture
• Gain awareness of reference and MDM frameworks and building blocks
• Know what MDM guiding principles consist of and best practices
• Know how to utilize reference and MDM in support of business strategy
This document discusses data governance and data architecture. It introduces data governance as the processes for managing data, including deciding data rights, making data decisions, and implementing those decisions. It describes how data architecture relates to data governance by providing patterns and structures for governing data. The document presents some common data architecture patterns, including a publish/subscribe pattern where a publisher pushes data to a hub and subscribers pull data from the hub. It also discusses how data architecture can support data governance goals through approaches like a subject area data model.
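A minimal sketch of that publish/subscribe pattern, assuming an in-memory hub to which the publisher pushes messages and from which each subscriber pulls whatever it has not yet consumed; the class and method names are illustrative, not taken from any particular product.

```python
# Publish/subscribe hub sketch: the publisher pushes data to the hub and
# each subscriber pulls what has accumulated since its last pull.
from collections import defaultdict

class Hub:
    def __init__(self):
        self._topics = defaultdict(list)   # topic -> ordered messages
        self._cursors = defaultdict(int)   # (topic, subscriber) -> read offset

    def publish(self, topic: str, message: dict) -> None:
        self._topics[topic].append(message)          # publisher pushes

    def pull(self, topic: str, subscriber: str) -> list:
        """Return the messages this subscriber has not yet seen."""
        offset = self._cursors[(topic, subscriber)]
        messages = self._topics[topic][offset:]
        self._cursors[(topic, subscriber)] = len(self._topics[topic])
        return messages

hub = Hub()
hub.publish("customer_updates", {"id": 42, "change": "address"})
print(hub.pull("customer_updates", "risk_reporting"))  # the new message
print(hub.pull("customer_updates", "risk_reporting"))  # [] - already consumed
```

Decoupling producers from consumers this way means the hub, not point-to-point interfaces, becomes the natural place to enforce governance rules.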
This practical presentation will cover the most important and impactful artifacts and deliverables needed to implement and sustain governance. Rather than speak hypothetically about what output is needed from governance, it covers and reviews artifact templates to help you re-create them in your organization.
Topics covered:
- Which artifacts are most important to get started
- Important artifacts for more mature programs
- How to ensure the artifacts are used and implemented, not just written
- How to integrate governance artifacts into operational processes
- Who should be involved in creating the deliverables
Enterprise Architecture vs. Data Architecture (DATAVERSITY)
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
RWDG Slides: What is a Data Steward to do? (DATAVERSITY)
Most people recognize that Data Stewards play an essential role in their Data Governance and Information Governance programs. However, the manner in which Data Stewards are used is not the same from organization to organization. How you use Data Stewards depends on your goals for Data Governance.
Join Bob Seiner for this month’s RWDG webinar where he will share different ways to activate Data Stewards based on the purpose of your program. Bob will talk about options to extend existing Data Steward activity and how to build new functionality into the role of your Data Stewards.
In this webinar, Bob will discuss:
- The crucial role of the Data Steward in Data Governance
- Different types of Data Stewards and what they do
- Aligning Data Steward activities with program goals
- Improving existing Data Steward actions
- Finding new ways to use your Data Stewards
Reference and master data management:
Two categories of structured data:
Master data: data associated with core business entities such as customer, product, and asset.
Transaction data: the recording of business transactions, such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
Reference data: any kind of data used solely to categorize other data found in a database, or solely to relate data in a database to information beyond the boundaries of the enterprise.
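To make the distinction concrete, the sketch below shows one invented record of each type and how the reference data gives meaning to the codes carried on the other records.

```python
# One invented record of each type; all values are made up for illustration.

# Master data: a core business entity (here, a customer).
customer = {"customer_id": "C-1001", "name": "Acme Ltd", "country_code": "GB"}

# Transaction data: a recorded business event that references master data.
payment = {"txn_id": "T-9001", "customer_id": "C-1001",
           "amount": 125.50, "type_code": "LOAN_PMT"}

# Reference data: used solely to categorize other data, e.g. country codes
# and internal transaction-type codes.
country_codes = {"GB": "United Kingdom", "US": "United States"}
txn_types = {"LOAN_PMT": "Loan payment", "CC_PMT": "Credit card payment"}

# The reference data resolves the codes held on the other records.
print(f"{customer['name']} is in {country_codes[customer['country_code']]}")
print(f"Transaction {payment['txn_id']} is a {txn_types[payment['type_code']]}")
```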
ModelDrivers the BCBS239 agile data management framework (Greg Soulsby)
The document discusses an agile data management framework for complying with BCBS 239, which requires banks to improve risk data aggregation and risk reporting. It proposes using a Data Point Model and the ModelDR tool to break data down into basic units and link them dynamically. This would allow banks to integrate risk data from different systems into reports more efficiently and at scale, by reverse engineering existing data formats and databases, designing new data models, and generating customized reports and queries.
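A hedged sketch of the data point idea: each value is stored as a base measure plus dimensional coordinates, so points captured from different systems can be filtered and re-aggregated into reports dynamically. The structure shown is an illustrative assumption, not the ModelDR tool's actual format.

```python
# Data-point sketch: a value plus dimensional coordinates; report queries
# are just filters over the dimensions. Structure is illustrative only.
data_points = [
    {"measure": "exposure", "value": 1_000_000,
     "dims": {"legal_entity": "Bank A", "risk_type": "credit", "currency": "EUR"}},
    {"measure": "exposure", "value": 250_000,
     "dims": {"legal_entity": "Bank A", "risk_type": "market", "currency": "EUR"}},
]

def report(points: list, **filters) -> float:
    """Aggregate the data points whose dimensions match every filter."""
    return sum(p["value"] for p in points
               if all(p["dims"].get(k) == v for k, v in filters.items()))

print(report(data_points, legal_entity="Bank A"))   # 1250000 - all risk types
print(report(data_points, risk_type="credit"))      # 1000000 - credit only
```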
How Cognizant's ZDLC solution is helping Data Lineage for compliance to Basel... (Dr. Bippin Makoond)
A solution powered by the Cognizant ZDLC framework to accelerate data extraction and improve the precision of end-to-end data lineage across systems using automation techniques.
A solution designed for the BCBS 239 Initiative.
Improving data quality & complying with BCBS239 (Alrick Dupuis)
This document presents our views and capabilities regarding Data Quality Management and BCBS 239. Improving decision-making capabilities through better data quality is a way to go beyond BCBS 239 compliance by making it useful for business efficiency.
London Financial Modelling Group 2015 04 30 - Model driven solutions to BCBS239 (Greg Soulsby)
The London Financial Modelling Group meeting of 2015 04 30 - Model driven solutions to BCBS239.
You will learn how to:
– Demonstrate compliance with each of the principles by re-purposing your information architecture
– Meet the obligations more efficiently by leveraging FIBO and semantic technology
– Show your management team how data architecture helps meet BCBS239, using his “BCBS239 Model driven solutions checklist” tool.
Breakfast seminar on risk data aggregation and risk reporting (Transcendent Group)
Breakfast seminar on risk data aggregation and risk reporting, 29 November 2014, Rigoletto.
Speakers: Elisabeth Antonsson, Nordea, and Kristofer Söderholm, Transcendent Group
How Ally Financial Achieved Regulatory Compliance with the Data Management Ma... (DATAVERSITY)
A Data Management Maturity Model Case Study
Ally Financial Inc., previously known as GMAC Inc., is a bank holding company headquartered in Detroit, Michigan. Ally has more than 15 million customers worldwide, serving over 16,000 auto dealers in the US. In 2009 Ally Bank was launched – at present it has over 784,000 customers, a satisfaction score of over 90%, and has been named the “Best Online Bank” by Money magazine for the last four years.
Ally was an early adopter of the DMM, conducting a broad-based evaluation of its data management practices, and creating a strategy and sequence plan for improvements based on the results. Ally’s implementation of an integrated, organization-wide data management program including data governance, a robust data quality program, and managed data standards, resulted in a “Satisfactory” rating on its latest regulatory audit.
In this webinar, you will learn:
- How Ally employed the DMM to evaluate its data management practices
- Who was involved and lessons learned
- How Ally prioritized and sequenced data management improvement initiatives
- How the data management program has been enhanced and expanded
- Business impacts and benefits realized
- Major initiatives completed and underway
- How Ally is leveraging DMM 1.0 to proactively prepare for BCBS 239 compliance
A successful data governance capability requires a strategy to align regulatory drivers and technology enhancement initiatives with business needs and objectives, taking into account the organizational, technological and cultural changes that will need to take place.
Fuel your Data-Driven Ambitions with Data Governance (Pedro Martins)
The document discusses the importance of data governance and provides an overview of how to implement an effective data governance program. It recommends obtaining executive sponsorship, aligning objectives to business initiatives, prioritizing initiatives, getting frameworks ready, and socializing the program. The document outlines data governance building blocks, including assessing maturity, developing a master plan, selecting tools, and establishing an organizational framework. It also discusses preparing an organization for success with data governance.
Cognitivo - Tackling the enterprise data quality challenge (Alan Hsiao)
Competing effectively in the digital age means being data-driven to make the right long-term and short-term decisions. However, the quality of your decisions will be proportional to the quality of your facts. Data quality is the critical, stable foundation for your organisation's transition to a data-driven, AI-enabled organisation.
Data Quality_ the holy grail for a Data Fluent Organization.pptx (Balvinder Hira)
This document discusses the importance of data quality for organizations. It notes that many organizations struggle with issues like being unable to perform root cause analysis on failures due to poor-quality data. The document defines data quality as the degree to which data fulfills its intended purpose. It discusses common data quality issues like missing context, non-uniform definitions, and underestimated impact. The document also outlines a conceptual data quality framework and discusses ensuring data quality across the data pipeline, from collection to downstream use.
Getting Data Quality Right
High quality data is important for organizational success, but achieving good data quality requires a programmatic approach. Data quality challenges are often the root cause of IT and business failures. To improve, organizations need to take a systems thinking approach, understand data issues over time, and not underestimate the role of culture. Developing repeatable data quality capabilities and expertise can help organizations identify problems, determine causes, and prevent future issues. Effective data quality engineering provides a framework for utilizing data to support business strategy and goals.
Dun & Bradstreet collects data from over 30,000 global sources to maintain records on over 233 million companies worldwide. They use a proprietary process called DUNSRight that involves global data collection, entity matching, assigning D-U-N-S numbers, determining corporate linkages, and using predictive indicators to ensure high data quality. Data.com also sources data from its community of over 2 million members who contribute and update contact records, as well as validating data through technological processes. Both Data.com and Dun & Bradstreet have dedicated teams that continuously monitor data quality and make improvements to their methods.
The Chief Data Officer's Agenda: The Need for Information Governance Controls (DATAVERSITY)
Information professionals are working in a proverbial fishbowl, always under watchful scrutiny. Increases in regulations, data standards, and competitive environments, coupled with an explosion of information and data assets across the organization, have led to significant challenges in the world of information. Financial regulations such as Basel III are redefining the control landscape. The Affordable Care Act, and extensions within HIPAA, are redefining healthcare. Data Governance is maturing into Information Governance based on further expansions into business operations, legal, records management, and information security, and an awareness of the need to focus on derived information over pure data. Against this backdrop, CXO executives are demanding more and more controls prior to automating financial or industry reporting, to provide assurance that information is accurate. This webinar explores the nature of Information Governance controls and how they can lead to operational efficiencies and the reduction of multiple enterprise risk factors.
This webinar will provide an overview of:
The emergence of Information Governance
The nature of typical information landscape across an organization
The definition and need for Information Governance Controls
How Information Governance Controls drive operational efficiency and risk reduction
Critical success factors for integrating controls into overall Information Governance frameworks
In today's competitive market, many organizations are unaware of the quantity of poor-quality data in their systems. Some organizations assume that their data is of adequate quality, although they have conducted no metrical or statistical analysis to support the assumption. Others know that their performance is hampered by poor-quality data, but they cannot measure the problem.
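Even a simple statistical profile turns "we suspect our data is poor" into a number. The sketch below computes per-column completeness and validity rates over a few invented records; the sample data, the validity rule, and the field names are all assumptions made for illustration.

```python
# Simple metrical data quality profile: completeness and validity rates.
# Sample records, rule, and field names are invented for the example.
import re

records = [
    {"customer_id": "C-1", "email": "a@example.com"},
    {"customer_id": "C-2", "email": ""},
    {"customer_id": None,  "email": "not-an-email"},
]

def completeness(rows: list, column: str) -> float:
    """Share of rows where the column is present and non-empty."""
    return sum(1 for r in rows if r.get(column)) / len(rows)

def validity(rows: list, column: str, pattern: str) -> float:
    """Share of the filled values that match the expected pattern."""
    values = [r[column] for r in rows if r.get(column)]
    return sum(1 for v in values if re.fullmatch(pattern, v)) / len(values)

email_rule = r"[^@]+@[^@]+\.[^@]+"
print(f"customer_id completeness: {completeness(records, 'customer_id'):.0%}")  # 67%
print(f"email completeness: {completeness(records, 'email'):.0%}")              # 67%
print(f"email validity: {validity(records, 'email', email_rule):.0%}")          # 50%
```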
Trillium software garp march 2014 presentation bfast briefing (Trillium Software)
This document discusses the challenges financial firms face with increasing data and regulatory requirements. It notes the growing data volumes and need for better data quality to avoid regulatory issues. However, IT is often a bottleneck, unable to provide the flexibility and visibility needed. The document advocates establishing data governance with clear roles and standards, profiling data to define issues, and providing transparency through dashboards. It describes a case study where a bank implemented a self-service data assurance system using business rules to enable risk teams to validate data for monthly regulatory submissions independently of IT, with measurable process improvements. Overall it promotes business empowerment and access to data as critical for effective governance.
Increasing Agility Through Data Virtualization (Denodo)
This document discusses how data virtualization can help enterprises address data management challenges by providing a single source of truth, reducing data proliferation, enabling standardization and improving data quality. It describes how financial institutions face increased regulatory scrutiny around data practices. The solution presented is a Data Services Layer that acts as a common provisioning point for accessing authoritative data sources using technologies like data virtualization. Effective data governance is also emphasized as critical to the success of any data virtualization effort.
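As a rough illustration of such a common provisioning point, the sketch below answers queries by federating two in-memory "sources" on the fly, without copying the data into a new store. Real data virtualization platforms do this across databases and services; every name here is invented.

```python
# Toy data services layer: one provisioning point that builds a unified
# view by joining authoritative sources at query time, not by copying.
crm_source = {"C-1": {"name": "Acme Ltd", "segment": "SME"}}
risk_source = {"C-1": {"rating": "BB", "exposure": 500_000}}

class DataServicesLayer:
    def __init__(self, sources: dict):
        self.sources = sources               # source name -> {key: record}

    def get_customer_view(self, customer_id: str) -> dict:
        """Federate the record for one customer across every source."""
        view = {"customer_id": customer_id}
        for source in self.sources.values():
            view.update(source.get(customer_id, {}))
        return view

layer = DataServicesLayer({"crm": crm_source, "risk": risk_source})
print(layer.get_customer_view("C-1"))
# {'customer_id': 'C-1', 'name': 'Acme Ltd', 'segment': 'SME',
#  'rating': 'BB', 'exposure': 500000}
```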
Oracle Application User Group sponsored Collaborate 2009 Presentation 'Building a Practical Strategy for Managing Data Quality' by Alex Fiteni CPA, CMA
Data-Ed Webinar: Data Quality Success Stories (DATAVERSITY)
Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar will demonstrate how chronic business challenges can often be attributed to the root problem of poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. Establishing this framework allows organizations to more efficiently identify business and data problems caused by structural issues versus practice-oriented defects; giving them the skillset to prevent these problems from re-occurring.
Learning Objectives:
- Understanding foundational data quality concepts based on the DAMA DMBOK
- Utilizing data quality engineering in support of business strategy
- Case studies illustrating data quality success
- Data quality guiding principles and best practices
- Steps for improving data quality at your organization
Data Quality Management: Cleaner Data, Better Reporting (Accenture)
This document discusses Accenture's regulatory reporting framework and offerings around data quality management. It provides an overview of Accenture's high-performance financial reporting framework, which aims to consolidate frameworks, processes, and technology to create efficiencies across reporting functions. It also summarizes Accenture's regulatory reporting offerings, including data quality management, capability design, target operating models, and regulatory reporting vendor implementation support. Finally, it covers key aspects of data quality management, such as issue classification, management processes, governance structures, root cause analysis, and issue prioritization. The goal is to help financial institutions improve data quality, reporting accuracy and efficiency.
Data Virtualization for Compliance – Creating a Controlled Data Environment (Denodo)
CIT modernized its data architecture in response to intense regulatory scrutiny. This presentation describes how data virtualization is being used to drive standardization, enable cross-company data integration, and serve as a common provisioning point from which to access all authoritative sources of data.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/CCqUeT.
Business Intelligence (BI) solutions can help manufacturing business users to analyse cost factors and make appropriate decisions for acquisition of raw material and sold goods.
This presentation contains our view on how data can be strategically managed and stewarded in an organization, and the categories where rules can be applied to facilitate that process.
Reinvent Your Data Management Strategy for Successful Digital TransformationDenodo
This document discusses reinventing data management strategies for digital transformation. It notes that IT spends a large amount on ETL and storage but most data is not used. It also notes a growing gap between business needs for fast data access and analysis and IT's ability to provide it. The document proposes data virtualization as a solution to give both business and IT agility by providing unified access to all data sources. It provides examples of how data virtualization helped organizations like Indiana University and HUD improve strategic decision making and prevent fraud.
From Compliance to Customer 360: Winning with Data Quality & Data GovernancePrecisely
Winning football teams will dominate opponents both defensively and offensively. Similarly, the most successful businesses will best utilize enterprise data for effective “defense” (e.g., compliance, such as GDPR and CCAR) as well as “offense” (increased customer engagement and revenue).
View our on-demand webcast and discover how integrated data quality and data governance tools help you confidently achieve regulatory compliance, as well as revenue-building initiatives requiring a 360-degree view of your customers.
Data management experts Ian Rowlands, Product Marketing Manager of ASG, and Harald Smith, Director, Product Management of Trillium Software, discuss how Trillium Software for data quality, integrated with ASG’s Enterprise Data Intelligence solution, helps you pinpoint where data quality impacts your business, ensuring your enterprise data can be trusted to drive regulatory compliance as well as better business decisions.
View this on-demand webcast to learn how to:
• Improve data quality by leveraging data lineage maps
• Gain insight into where data quality gaps may exist, which may impact regulatory compliance and customer engagement initiatives
• Understand how changes may affect critical data elements and data quality
The document discusses business intelligence (BI) tools, data warehousing concepts like star schemas and snowflake schemas, data quality measures, master data management (MDM), and business intelligence competency centers (BICC). It provides examples of BI tools and industries that use BI. It defines what a BICC is and some of the typical jobs in a BICC like business analyst and BI programmer.
Similar to Infogix BCBS 239 Implementation Challenges (20)
Analysis insight about a Flyball dog competition team's performanceroli9797
Insights from my analysis of a Flyball dog competition team's performance last year. Find more: https://github.com/rolandnagy-ds/flyball_race_analysis/tree/main
State of Artificial intelligence Report 2023kuntobimo2016
Artificial intelligence (AI) is a multidisciplinary field of science and engineering whose goal is to create intelligent machines.
We believe that AI will be a force multiplier on technological progress in our increasingly digital, data-driven world. This is because everything around us today, ranging from culture to consumer products, is a product of intelligence.
The State of AI Report is now in its sixth year. Consider this report as a compilation of the most interesting things we’ve seen with a goal of triggering an informed conversation about the state of AI and its implication for the future.
We consider the following key dimensions in our report:
Research: Technology breakthroughs and their capabilities.
Industry: Areas of commercial application for AI and its business impact.
Politics: Regulation of AI, its economic implications and the evolving geopolitics of AI.
Safety: Identifying and mitigating catastrophic risks that highly-capable future AI systems could pose to us.
Predictions: What we believe will happen in the next 12 months and a 2022 performance review to keep us honest.
Unleashing the Power of Data_ Choosing a Trusted Analytics Platform.pdfEnterprise Wired
In this guide, we'll explore the key considerations and features to look for when choosing a trusted analytics platform that meets your organization's needs and delivers actionable intelligence you can trust.
Global Situational Awareness of A.I. and where its headedvikram sood
You can see the future first in San Francisco.
Over the past year, the talk of the town has shifted from $10 billion compute clusters to $100 billion clusters to trillion-dollar clusters. Every six months another zero is added to the boardroom plans. Behind the scenes, there’s a fierce scramble to secure every power contract still available for the rest of the decade, every voltage transformer that can possibly be procured. American big business is gearing up to pour trillions of dollars into a long-unseen mobilization of American industrial might. By the end of the decade, American electricity production will have grown tens of percent; from the shale fields of Pennsylvania to the solar farms of Nevada, hundreds of millions of GPUs will hum.
The AGI race has begun. We are building machines that can think and reason. By 2025/26, these machines will outpace college graduates. By the end of the decade, they will be smarter than you or I; we will have superintelligence, in the true sense of the word. Along the way, national security forces not seen in half a century will be unleashed, and before long, The Project will be on. If we’re lucky, we’ll be in an all-out race with the CCP; if we’re unlucky, an all-out war.
Everyone is now talking about AI, but few have the faintest glimmer of what is about to hit them. Nvidia analysts still think 2024 might be close to the peak. Mainstream pundits are stuck on the wilful blindness of “it’s just predicting the next word”. They see only hype and business-as-usual; at most they entertain another internet-scale technological change.
Before long, the world will wake up. But right now, there are perhaps a few hundred people, most of them in San Francisco and the AI labs, that have situational awareness. Through whatever peculiar forces of fate, I have found myself amongst them. A few years ago, these people were derided as crazy—but they trusted the trendlines, which allowed them to correctly predict the AI advances of the past few years. Whether these people are also right about the next few years remains to be seen. But these are very smart people—the smartest people I have ever met—and they are the ones building this technology. Perhaps they will be an odd footnote in history, or perhaps they will go down in history like Szilard and Oppenheimer and Teller. If they are seeing the future even close to correctly, we are in for a wild ride.
Let me tell you what we see.
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Round table discussion of vector databases, unstructured data, ai, big data, real-time, robots and Milvus.
A lively discussion with NJ Gen AI Meetup Lead Prasad and Procure.FYI's Co-Founder
Learn SQL from basic queries to Advance queriesmanishkhaire30
Dive into the world of data analysis with our comprehensive guide on mastering SQL! This presentation offers a practical approach to learning SQL, focusing on real-world applications and hands-on practice. Whether you're a beginner or looking to sharpen your skills, this guide provides the tools you need to extract, analyze, and interpret data effectively.
Key Highlights:
Foundations of SQL: Understand the basics of SQL, including data retrieval, filtering, and aggregation.
Advanced Queries: Learn to craft complex queries to uncover deep insights from your data.
Data Trends and Patterns: Discover how to identify and interpret trends and patterns in your datasets.
Practical Examples: Follow step-by-step examples to apply SQL techniques in real-world scenarios.
Actionable Insights: Gain the skills to derive actionable insights that drive informed decision-making.
Join us on this journey to enhance your data analysis capabilities and unlock the full potential of SQL. Perfect for data enthusiasts, analysts, and anyone eager to harness the power of data!
#DataAnalysis #SQL #LearningSQL #DataInsights #DataScience #Analytics
Enhanced Enterprise Intelligence with your personal AI Data Copilot.pdfGetInData
Recently we have observed the rise of open-source Large Language Models (LLMs) that are community-driven or developed by the AI market leaders, such as Meta (Llama3), Databricks (DBRX) and Snowflake (Arctic). On the other hand, there is a growth in interest in specialized, carefully fine-tuned yet relatively small models that can efficiently assist programmers in day-to-day tasks. Finally, Retrieval-Augmented Generation (RAG) architectures have gained a lot of traction as the preferred approach for LLMs context and prompt augmentation for building conversational SQL data copilots, code copilots and chatbots.
In this presentation, we will show how we built upon these three concepts a robust Data Copilot that can help to democratize access to company data assets and boost performance of everyone working with data platforms.
Why do we need yet another (open-source) Copilot?
How can we build one?
Architecture and evaluation
2. Bank Challenges in Addressing BCBS 239
BCBS 239 Pillars: Supervisory Reviews, Risk Reporting, Risk Aggregation, Data Architecture, Data Governance
Business Challenges
• BCBS 239 is not a cookbook: best practices have to be rounded up and followed from the ground up
• Past M&A has created an overly complex and rigid IT landscape
• Banks need to reach consensus on data issues such as data quality, data definitions, data availability, accountability, and storage and retrieval processes
• The complexity, size and availability of the data banks use leads to a lack of data adaptability; size is a paradoxical factor
• Aggregating data at a cross-border level and reconciling it at the legal-entity level is more complex
• Risk reports produced within IT landscapes are generally standardized, database-specific reports with predefined frequencies and parameters
• Banks find it difficult to determine how authorities are going to assess their compliance
Supporting capabilities: Data Integrity, Data Quality, Metadata Management, Data Integration, Data Reconciliation, Data Aggregation, Multi-Level Reporting, Information Architecture, Information Governance
3. Banks' Maturity in the US

Capability       G-SIBs                    D-SIBs
Data Quality     In place, not complete    In place, not complete
Data Integrity   Minimal, but more manual  Minimal, but more manual
Architecture     Progressive               Evolving
Governance       Progressive               Evolving
Aggregation      In place, complex         In place, complex
Reporting        In place, complex         In place, complex
Integration      In place, mature          In place, mature
Reconciliation   In place, selective       In place, selective
Metadata         Progressive               Progressive
4. Challenges in D-SIBs for the Governance Team
• Data Quality (in place, not complete): "We have it – but we are not sure how much more we should invest in DQ, and where?"
• Data Integrity (minimal, but more manual): "We thought about it – we haven't done it because of time-to-deliver pressure when we built out our ETL."
• Architecture (evolving): "Yes, it is complex and we need more architects – but isn't re-architecting a great idea?"
5. Challenges in D-SIBs for the Governance Team
• Governance (evolving): "We have our policy in place – now we need to be more operational. We need more metrics and a better inventory; I would like to know what came in daily and who is using what."
• Aggregation (in place, complex): "It's complex and done at multiple source levels – we have documented it, but we are not sure how it works end-to-end."
• Reporting (in place, complex): "Sure, we have more reports than our employee count! But it's a challenge to make the business trust my reporting data and make fact-based decisions."
6. Challenges in D-SIBs for the Governance Team
• Integration (in place, mature): "We have a great ETL team and tools – we also have too many jobs to manage and monitor, and our support is complex."
• Reconciliation (in place, selective): "Sure, we do it for our financial books – but not across all the process gates for risk data and reporting."
• Metadata (progressive): "A metadata tool is in place and we have a business glossary, but it's a challenge to keep up with changes and keep the metadata in sync."
8. BCBS 239 – Collect, Measure, Monitor and Report
Collect – build an inventory of the key risk data used in data aggregation and reporting:
• Risk data elements, risk parameters, counterparty information and risk model parameters need to be identified both in standardized form and as they are represented in the origination system
• Group data elements into KDEs, DDEs, SDEs and MDEs to assign the right DQ gradient for measuring each data point's usefulness
• Validate which governance policies and processes influence the KDE and DDE data points, and capture/report audit points to manage policy adherence
• Report data inventories against each KDE/DDE/SDE origination source to ensure frequency and timeliness parameters are met
• Connect key data attributes with metadata and DQ initiatives to bring in transparency
• Build application-level data inventories and connect them to produce enterprise-level data inventories
• Refresh your data inventory through automation to keep the inventory metadata current
• Assign owners/custodians for the data and provide them with visibility reports on data metrics
• If possible, attach data metrics to each field in your metadata tools to build trust in the attributes (a minimal inventory sketch follows this slide)
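To make the inventory idea concrete, here is a minimal sketch in Python, assuming a hypothetical element list, invented field and owner names, and a simple numeric gradient per KDE/DDE/SDE/MDE group; it is illustrative only, not Infogix functionality.

```python
# Illustrative sketch only: a hypothetical risk-data inventory with a simple
# numeric DQ gradient per grouping. Names, fields and owners are invented.
from dataclasses import dataclass

@dataclass
class DataElement:
    name: str           # standardized element name
    source_field: str   # how the element is represented in the origination system
    source_system: str  # origination source
    grouping: str       # "KDE", "DDE", "SDE" or "MDE"
    owner: str          # accountable owner/custodian

# Higher gradients imply stricter DQ checks and tighter reporting SLAs.
DQ_GRADIENT = {"KDE": 4, "DDE": 3, "SDE": 2, "MDE": 1}

inventory = [
    DataElement("counterparty_id", "CPTY_NO", "LoanOrigination", "KDE", "risk.data@bank"),
    DataElement("exposure_amount", "EXPSR_AMT", "LoanProcessing", "KDE", "risk.data@bank"),
    DataElement("branch_code", "BR_CD", "LoanOrigination", "SDE", "retail.ops@bank"),
]

# Report the inventory by gradient and origination source so frequency and
# timeliness parameters can be validated per system.
for e in sorted(inventory, key=lambda e: -DQ_GRADIENT[e.grouping]):
    print(f"{e.grouping} (gradient {DQ_GRADIENT[e.grouping]}): "
          f"{e.name} <- {e.source_system}.{e.source_field}, owner={e.owner}")
```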
9. BCBS 239 – Collect, Measure, Monitor and Report
Control points span Origination Sources, Data Management Apps, Data Aggregation Rules, Reporting Systems and Audits/External Extracts, with control metadata and data metrics measured and captured end to end and surfaced through metadata reporting and dashboards.
• Capture metrics such as counts, volumes and date ranges for a given data set, and compare them with your organization's benchmarks and SLAs – this can be your early warning system!
• Look at source data patterns and trends before and after the aggregation rules, and compare them with benchmark expected results to ensure your aggregation is working as expected
• Reconcile your data sets with the golden source of the data, building as many reconciliation steps as possible into your flow – this can help improve data trust with business teams
• Identify specific data quality parameters for each critical data element and always look for deviations from normal
• Build a consolidated application-level data trust score, and make application owners responsible for managing their scores (a scoring sketch follows this slide)
Measure the data flowing in from different applications and used by reporting teams – if you cannot quantify, you cannot govern well.
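A minimal sketch of the early-warning and trust-score ideas, assuming invented metrics, benchmarks and a 5% tolerance; a real implementation would source these from the captured control metadata rather than hard-coded values.

```python
# Illustrative sketch only: compare captured feed metrics against assumed
# organizational benchmarks (5% tolerance) and roll the checks up into an
# application-level trust score. Figures and thresholds are invented.
def within_tolerance(observed: float, benchmark: float, tolerance: float = 0.05) -> bool:
    """True if the observed metric is within +/- tolerance of the benchmark."""
    return benchmark != 0 and abs(observed - benchmark) / benchmark <= tolerance

feed_metrics = {"row_count": 1_048_000, "total_amount": 470.1e6}  # captured today
benchmarks   = {"row_count": 1_050_000, "total_amount": 515.0e6}  # organizational SLA

checks = {m: within_tolerance(feed_metrics[m], benchmarks[m]) for m in benchmarks}

# Application-level trust score: the share of checks that passed.
trust_score = 100 * sum(checks.values()) / len(checks)
for metric, ok in checks.items():
    print(f"{metric}: {'OK' if ok else 'EARLY WARNING - investigate'}")
print(f"Application trust score: {trust_score:.0f}%")
```

Run daily, a falling trust score becomes the application owner's cue to investigate before the reporting cycle closes.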
10. BCBS 239 – Collect, Measure, Monitor and Report
• Leverage analytics to predict out-of-boundary data issues and resolve them
• Review data quality scores daily and check whether a score change warrants an investigation
• Benchmark against industry data points, where available, to validate adverse changes in key data elements
• Monitor reconciliation differences at each point in the data movement process (a threshold sketch follows this slide)
• Share monitoring reports with business teams when data issues exceed the threshold
Monitor the data flows daily to catch any anomalies or aberrations in consolidated data metrics.
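A minimal sketch of threshold-based reconciliation monitoring; the checkpoints, balances and materiality threshold below are invented for illustration.

```python
# Illustrative sketch only: reconcile a consolidated balance at each point in
# the data movement chain and alert when the difference exceeds an assumed
# materiality threshold. Checkpoints and balances are invented.
RECON_THRESHOLD = 10_000.0  # assumed materiality threshold, in dollars

# (checkpoint, golden-source balance, downstream balance) captured daily
daily_recon = [
    ("GL -> EDW",            512_400_000.00, 512_395_500.00),
    ("EDW -> RiskWarehouse", 512_395_500.00, 512_250_000.00),
]

for checkpoint, golden, downstream in daily_recon:
    diff = downstream - golden
    if abs(diff) > RECON_THRESHOLD:
        # Above threshold: share with the business team for investigation.
        print(f"ALERT {checkpoint}: diff {diff:+,.2f} exceeds threshold")
    else:
        print(f"OK    {checkpoint}: diff {diff:+,.2f} within threshold")
```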
11. BCBS 239 – Collect, Measure, Monitor and Report
• Provide end-to-end trend reporting dashboards to business users whenever they want to make a decision using the data – this helps build data confidence
• Compare data issues raised with the support group against trend deviations to gain insight into data issue root causes (a trend-deviation sketch follows below)
• Look at reconciliation deviations by trend and understand the different patterns to improve accuracy parameters
• Share reports with source application teams and explore the DQ initiatives that can be focused on to deliver high ROI
• Explore the option of using analytics to predict data issues before they arise, and plan to mitigate them in advance
[Charts: "YoY Retail Exposure Growth" – retail exposures in $Mn plotted by year, 2011–2015; "Daily Diff – GL Balance vs. Source Balance" – the daily difference tracked across a 21-day window]
Report data trends on your risk KDEs.
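A minimal sketch of trend-deviation detection on the daily GL-vs-source difference, assuming an invented series and a simple two-sigma rule in place of the deck's (unspecified) analytics.

```python
# Illustrative sketch only: flag days whose GL-minus-source difference
# deviates from the series trend by more than two standard deviations.
# The daily series is invented for the example.
from statistics import mean, stdev

daily_diff = [12, -5, 8, 3, -7, 6, 4, 215, 9, -2]  # GL balance minus source balance

mu, sigma = mean(daily_diff), stdev(daily_diff)
for day, diff in enumerate(daily_diff, start=1):
    if abs(diff - mu) > 2 * sigma:
        print(f"Day {day}: diff {diff} deviates from trend "
              f"(mean {mu:.1f}, sd {sigma:.1f}) - candidate for root-cause review")
```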
13. Infogix – Control & Reporting
We can provide controls at each point – to measure, monitor and promote data integrity, DQ improvements, data trust and data transparency, and to help you govern your data better.
Control points span Origination Sources, Data Management Apps, Data Aggregation Rules, Reporting Systems and Audits/External Extracts, with control metadata and data metrics measured and captured end to end and surfaced through metadata reporting and dashboards.
14. Infogix Demo – Loan Origination Example
1. Consider a loan origination system with multiple sources of origination, such as e-commerce, third-party and mainframe systems
2. It is common for third-party loan application apps to be used to track and approve loan requests
3. Approved loans are posted to the loan processing systems, and the data is fed to the EDW daily
4. The loan processing system posts loan data to the financial GL systems and also sends daily feeds to the EDW
5. Internal retail data marts, used for loan reporting at the departmental level, get data feeds from the loan systems and/or the EDW
6. The risk data warehouse gathers risk data points from the EDW, the loan systems and the financial GL for internal and external reporting
Challenges:
1. Lack of data-level visibility between origination, application, processing, financial books and reporting
2. Data integrity questions can arise every time there is a data issue, creating data trust challenges!
3. Discrepancies in loan reporting between the GL, departmental reports and enterprise risk reports
4. Entity-level risk roll-ups might not match department-level risk exposures
5. Complex data transformations, ETL and aggregations can sometimes break information value – data quality with the right parameters is a challenge
15. Loan Origination Example
Reconciliation may be needed at different levels to ensure we report the same numbers across the enterprise.
[Flow diagram: 3rd-party loans and e-loans feed Loan Origination; loans pass through the Loan Approval Systems (with unapproved loans excluded) into the Loan Processing Systems, which post to the Financial GL System for P&L reporting and feed Information Management for loan reconciliation and loan reporting. Downstream, the Enterprise Warehouse, Retail Reporting Mart and Risk Warehouse serve retail departmental reporting, internal risk reporting and external FFIEC reporting. At every interface the diagram tracks loan counts (#) and amounts ($) as input, excluded and output controls, so each hand-off can be reconciled; a reconciliation sketch follows below.]
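A minimal sketch of the count (#) and amount ($) controls from the diagram: at each hand-off, input minus exclusions should equal output. The system names and figures are invented around the example flow, not taken from a real bank.

```python
# Illustrative sketch only: the count (#) and amount ($) controls from the
# diagram. At each hand-off, input minus exclusions should equal output;
# the system names and figures are invented for the example flow.
def reconcile(point: str, n_in: int, amt_in: float,
              n_excl: int = 0, amt_excl: float = 0.0,
              n_out: int = 0, amt_out: float = 0.0) -> None:
    n_diff = n_in - n_excl - n_out
    amt_diff = amt_in - amt_excl - amt_out
    status = "OK" if n_diff == 0 and abs(amt_diff) < 0.01 else "BREAK"
    print(f"{status} {point}: count diff={n_diff}, amount diff={amt_diff:,.2f}")

# Loan Approval -> Loan Processing: unapproved loans are excluded.
reconcile("Approval -> Processing", n_in=1_000, amt_in=250e6,
          n_excl=50, amt_excl=12e6, n_out=950, amt_out=238e6)

# Loan Processing -> Financial GL: everything posted should flow through.
reconcile("Processing -> GL", n_in=950, amt_in=238e6, n_out=950, amt_out=237.5e6)
```

The second hand-off reports a $500,000 break, exactly the kind of discrepancy that otherwise surfaces as mismatched GL, departmental and enterprise risk reports.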
21. Infogix – Control & Reporting
• Provide KPI reporting by data group across DQ gradients
• Compare key elements across gradients
• Identify trends and data occurrences for each element
22. 200+ Customers
Including 20 of the Fortune 100, 7 of the top 10 commercial banks, 6 of the top 10 P&C insurers and 4 of the top 5 health insurers
• 400 million wireless, cable & broadband subscribers covered
• 30+ years as a leader in analyzing data across the enterprise
• 500+ employees (and growing)
• >$1 trillion in revenue represented by our customers
• >96% annual customer satisfaction rating
• 1 million+ total Infogix business rules running for our customers
• Managing risk is #1: 72% of customers report it as the key benefit of utilizing Infogix
• >15 years: the average number of years that customers have partnered with Infogix
23. Thank You
Visit www.infogix.com or contact kparal@infogix.com for more information
Infogix Balancing and Reconciliation
Editor's Notes
Reliance on purpose-built data infrastructure and reporting
Banks rate their own compliance with risk reporting principles higher than their compliance with Governance, infrastructure and data aggregation principles.
Banks tend to appear compliant at the group level or at a specific legal-entity level, but lack the same capability at other aggregation levels and hence do not meet the adaptability requirements.
2. Bank IT teams have many large-scale, ongoing projects spanning multiple years, leading to resource availability issues and complex project dependencies, while the data landscape keeps evolving due to the data explosion in the financial sector.
Mandatory Slide
Slide Purpose: Credibility
Example Talking Points:
To wrap it up, here’s a nice Infographic on Infogix.
The #1 reason customers use Infogix is to manage risk – the Infogix Enterprise Data Analysis Platform turns data into a competitive advantage.
Over 32 years in the data analysis business
We have 500 employees supporting more than 200 customers.
Interesting statistic: Over $1T in revenue is represented by the customer portfolio that we serve (200+ big name customers)
Over 1M controls/business rules are running at our customers to ensure data integrity, with the outcome of trustworthy data
Or… we have over 1M controls/business rules at over 200 customers. Unknown to you, when you shop (retail, banking, insurance, healthcare) or bank, your transaction probably went through Infogix data analysis.
We truly believe in partnering with our customers, and we are proud to say that our average customer has been with us for well over 15 years. We hope to have the same opportunity with you.