1) The document discusses BCBS 239, a regulation from the Basel Committee on Banking Supervision that demands accurate and timely risk data reporting. It focuses on developing risk management capabilities rather than just compliance.
2) EY has developed a Risk Data Aggregation and Reporting (RDAR) Framework to help banks comply with BCBS 239 in a practical way by prioritizing capabilities and coordinating change efforts. The framework addresses key areas like data, processes, people and technology.
3) Banks face many challenges in coordinating regulatory changes from different rules with overlapping requirements. The RDAR Framework helps banks integrate priorities using common capability objectives to connect changes in a meaningful way.
This document discusses implementing the Basel Committee on Banking Supervision's Principles for Effective Risk Data Aggregation and Risk Reporting (BCBS-239). It begins with some questions and challenges around BCBS-239 implementation related to the scope of risk data aggregation, level of automation, compliance focus, regulator engagement, implementation guidance, target state definition, and progress measurement. The document then presents six themes for interpreting the BCBS-239 principles: materiality, flexibility, reconciliation and validation, transparency, automation and adaptation, and speed and confidentiality. It outlines a framework moving from siloed, manual arrangements to a totally integrated, automated environment. Benchmarks are suggested to measure compliance levels against this target state.
This document discusses the Basel Committee on Banking Supervision (BCBS) 239 principles for effective risk data aggregation and risk reporting (RDARR). Some key points:
- BCBS 239 aims to enhance banks' ability to identify and manage firm-wide risks by improving data aggregation capabilities and risk reporting, especially during a crisis.
- It applies not just to globally systemic banks but also domestic systemically important banks. Banks face an aggressive timeline for compliance, with globally systemic banks required to implement the principles in full by early 2016.
- Failure to comply could result in regulatory penalties, increased capital charges, and reputational risks. Compliance also provides opportunities to unlock strategic value across the organization through better risk management
In the wake of the 2008 financial crisis, the BCBS instituted a framework to ensure that banks and their supervisors can manage company- and industry-level risks by leveraging data, and to ensure more robust data capabilities and mechanisms to support: (1) decision making, (2) strategy formulation, and (3) reporting.
The Diversity Imperative: 14th Annual Australian Chief Executive Study (PwC's Strategy&)
This report provides insight into the 2013 Australian Chief Executive Study findings, compares the results to the global market and identifies trends. Our analysis looks at trends relating to performance and tenure; reasons for CEO turnover; and the number of insider appointments versus outsider appointments.
The document compares ITIL and COBIT frameworks. While ITIL focuses specifically on IT service management, COBIT has a broader scope covering governance of enterprise IT. Some key similarities and differences are:
- COBIT aims to guide enterprise governance and management of IT across the organization. ITIL provides guidance for IT service providers to enable business value.
- COBIT covers a broader scope including principles, policies, processes, organizational structures, culture and more. ITIL focuses specifically on five stages of the service lifecycle.
- Both are aligned in their approach to IT service management, with COBIT processes mapping closely to ITIL stages.
Data Strategy for Telcos: Preparedness and Management (Sourav Rout)
Telcos sit on a vast amount of data, both in terms of magnitude and variety. The Internet of Things (IoT) is set to magnify this speed and volume of data exponentially. As organizations, telcos use data across the board: network performance and optimization, marketing, product placement, pricing, plans, customer experience, fraud detection, etc. It thus becomes important to ensure that data collection (and, where needed, disposal), processing, analytics and value creation are done uniformly across the organization.
The document discusses the importance of developing a data strategy before building a data warehouse. It defines a data strategy as a unified, organization-wide plan for using corporate data as a vital asset. The data strategy should address critical data issues like quality, metadata, performance, distribution, ownership, security and privacy. Developing a data strategy requires identifying strategic and operational decisions, aligning the strategy with business goals, and answering many questions across various data-related topics.
Check out this SlideShare to understand the challenges of BCBS 239 and learn ways to collect, measure, monitor and report on data to achieve better data integrity and quality. Both G-SIBs and D-SIBs will learn how to better govern their data.
The data architecture of solutions is frequently not given the attention it deserves or needs: too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components. This is due to the behaviours of both solution architects and data architects.
Solution architecture tends to concern itself with the functional, technology and software components of the solution.
Data architecture, in turn, tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Solution architecture, for its part, frequently omits the detail of the data aspects of solutions, leading to a solution data architecture gap. Together, these gaps result in a data blind spot for the organisation.
Data architecture tends to concern itself with the landscape only after individual solutions are delivered. It needs to shift left into the domain of solutions and their data, and engage more actively with the data dimensions of individual solutions. Data architecture can take the lead in sealing these data gaps through a shift-left of its scope and activities, as well as by providing standards and common data tooling for solution data architecture.
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Long data update and response times caused by poorly designed data structures, affecting solution usability and leading to lost productivity and transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident only after the solution goes live, so the benefits of solution data architecture are not always evident initially.
Booz Allen Hamilton and Market Connections: C4ISR Survey Report (Booz Allen Hamilton)
Booz Allen Hamilton partnered with government market research firm Market Connections, Inc. to conduct a survey of military decision-makers. The research examined the main features of Integrated C4ISR through Enterprise Integration: engineering, operations and acquisition. Two-thirds of respondents (65 percent) agree that agile, incremental delivery of modular systems with integrated capabilities can enable rapid insertion of new technologies.
The document discusses IT governance and provides an overview of key frameworks for IT governance, including ISO 38500 and COBIT. It begins by defining governance and describing how governance applies to IT. It then discusses why IT governance is important for organizations, noting benefits like ensuring strategic alignment between IT and business goals. The document also provides a detailed overview of the ISO 38500 standard for IT governance, describing its scope, framework and principles. It explains the standard's six principles of IT governance and provides examples. Overall, the document serves to introduce the topic of IT governance and some of the most relevant frameworks.
Improving Data Literacy Around Data Architecture (DATAVERSITY)
Data Literacy is an increasing concern, as organizations look to become more data-driven. As the rise of the citizen data scientist and self-service data analytics becomes increasingly common, the need for business users to understand core Data Management fundamentals is more important than ever. At the same time, technical roles need a strong foundation in Data Architecture principles and best practices. Join this webinar to understand the key components of Data Literacy, and practical ways to implement a Data Literacy program in your organization.
Data Governance — Aligning Technical and Business Approaches (DATAVERSITY)
Data Governance can have a varied definition, depending on the audience. To many, data governance consists of committee meetings and stewardship roles. To others, it focuses on technical data management and controls. Holistic data governance combines both of these aspects, and a robust data architecture and associated diagrams can be the “glue” that binds business and IT governance together. Join this webinar for practical tips and hands-on exercises for aligning data architecture & data governance for business and IT success.
The document discusses elements of developing a business intelligence strategy, including understanding an organization's BI maturity level, aligning metrics and goals across different business units, establishing a Business Intelligence Competency Center, and determining whether to build a BI solution from scratch or purchase pre-built BI applications. It provides an overview of various components that should be considered when creating a comprehensive BI strategy.
Data Architecture - The Foundation for Enterprise Architecture and Governance (DATAVERSITY)
Organizations are faced with an increasingly complex data landscape, finding themselves unable to cope with exponentially increasing data volumes, compounded by additional regulatory requirements with increased fines for non-compliance. Enterprise architecture and data governance are often discussed at length, though frequently with different stakeholder audiences. This can result in complementary and sometimes conflicting initiatives rather than a focused, integrated approach. Data governance requires a solid data architecture foundation in order to support the pillars of enterprise architecture. In this session, IDERA’s Ron Huizenga will discuss a practical, integrated approach to effectively understand, define and implement a cohesive enterprise architecture and data governance discipline with integrated modeling and metadata management.
Data Catalogs Are the Answer – What is the Question? (DATAVERSITY)
Organizations with governed metadata made available through their data catalog can answer questions their people have about the organization’s data. These organizations get more value from their data, protect their data better, gain improved ROI from data-centric projects and programs, and have more confidence in their most strategic data.
Join Bob Seiner for this lively webinar where he will talk about the value of a data catalog and how to build the use of the catalog into your stewards’ daily routines. Bob will share how the tool must be positioned for success and viewed as a must-have resource that is a steppingstone and catalyst to governed data across the organization.
IT Governance – The missing compass in a technology changing world (PECB)
Oladapo Ogundeji, CTO of Digital Jewels Ltd, gave a presentation on IT governance and its importance in today's technology changing world. He discussed that IT governance provides a formal process to define IT strategy and oversee its execution to achieve business goals. It also helps balance priorities like maximizing returns, increasing agility, and mitigating risks. Ogundeji covered frameworks like COBIT 5 and ISO 38500 that provide guidance on implementing IT governance and highlighted critical success factors like executive commitment, focus on execution, and competence in resources.
Data Governance Trends and Best Practices To Implement Today (DATAVERSITY)
1) The document discusses best practices for data protection on Google Cloud, including setting data policies, governing access, classifying sensitive data, controlling access, encryption, secure collaboration, and incident response.
2) It provides examples of how to limit access to data and sensitive information, gain visibility into where sensitive data resides, encrypt data with customer-controlled keys, harden workloads, run workloads confidentially, collaborate securely with untrusted parties, and address cloud security incidents.
3) The key recommendations are to protect data at rest and in use through classification, access controls, encryption, confidential computing; securely share data through techniques like secure multi-party computation; and have an incident response plan to quickly address threats.
Joining Forces: Interagency Collaboration and "Smart Power" (Booz Allen Hamilton)
This document summarizes the findings of a survey of 268 federal employees regarding interagency collaboration and addressing global challenges. Key findings include: 1) While agencies like Defense, State, and USAID share overlapping missions, collaboration is uneven and has not met expectations for some; 2) Budget pressures may increase the need for collaboration but managers are less optimistic it will reduce costs; 3) Smart power approaches remain applicable but support has decreased in some areas; 4) Agencies report having the tools needed but collaborating most effectively with other agencies compared to private/non-profit partners.
Data Democratization for Faster Decision-making and Business Agility (ASEAN) (Denodo)
Watch full webinar here: https://bit.ly/3ogsO7F
Presented at 3rd Chief Digital Officer Asia Summit
The idea behind Data democratization is to enable every type of user in a company to have access to data and to ensure that there is no dependency on any single party that might create a bottleneck to data access. But this is easier said than done especially given the complex data management landscape that most organizations have today. Data virtualization is a modern data integration technique that not only delivers data in real time without replication but also simplifies data discovery, data exploration and navigating between related data sets.
In this on-demand session, you will understand how data virtualization enables enterprises to:
- Reduce up to 80% the time required to deliver data to the business adapted to the needs of each user
- Apply consistent security and governance policies across the self-service data delivery process
- Seamlessly implement the concept of 'Data Marketplace'
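As a toy illustration of the data virtualization idea described above (a single logical access layer that resolves queries against heterogeneous sources at request time, without replicating data into a central store), here is a minimal Python sketch; the source names and schemas are invented for illustration and do not reflect any particular product:

```python
# Minimal sketch of data virtualization: a logical layer that answers
# queries by combining heterogeneous sources at request time, rather
# than copying their data into one store. All names are illustrative.

crm = {"c1": {"name": "Acme", "segment": "enterprise"}}   # e.g. a CRM API
billing = {"c1": {"balance": 1200}}                       # e.g. a billing DB

class VirtualLayer:
    def __init__(self, sources):
        # name -> lookup callable; each callable hides a different backend
        self.sources = sources

    def get(self, key):
        # Combine attributes from every source at query time (no replication).
        record = {}
        for lookup in self.sources.values():
            record.update(lookup(key) or {})
        return record

layer = VirtualLayer({"crm": crm.get, "billing": billing.get})
print(layer.get("c1"))
```

The consumer sees one unified record per key, while each backend keeps its own data; swapping a backend only changes the lookup callable, not the consumers.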
Graph databases provide the ability to quickly discover and integrate key relationships between enterprise data sets. Business use cases such as recommendation engines, social networks, enterprise knowledge graphs, and more provide valuable ways to leverage graph databases in your organization. This webinar will provide an overview of graph database technologies, and how they can be used for practical applications to drive business value.
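To sketch why graph structures suit recommendation use cases like those listed above, the following minimal Python example builds an adjacency-set graph and runs a one-hop "customers who bought X also bought Y" query; all names and data are illustrative assumptions, not taken from any particular graph database:

```python
from collections import defaultdict

# Minimal graph as adjacency sets, with a one-hop recommendation query:
# find other products connected to the customers of a given product.

class Graph:
    def __init__(self):
        self.edges = defaultdict(set)  # node -> set of neighbours

    def add_edge(self, a, b):
        # Undirected "bought" relationship between a customer and a product.
        self.edges[a].add(b)
        self.edges[b].add(a)

    def recommend(self, product):
        buyers = self.edges[product]          # customers of this product
        recs = set()
        for buyer in buyers:
            recs |= self.edges[buyer]         # everything those customers bought
        recs.discard(product)                 # don't recommend the product itself
        return recs

g = Graph()
g.add_edge("alice", "book")
g.add_edge("alice", "lamp")
g.add_edge("bob", "book")
g.add_edge("bob", "chair")
print(sorted(g.recommend("book")))
```

A dedicated graph database generalizes this to typed nodes and edges with multi-hop traversal queries, but the relationship-first data model is the same.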
Creating a clearly articulated data strategy—a roadmap of technology-driven capability investments prioritized to deliver value—helps ensure from the get-go that you are focusing on the right things, so that your work with data has a business impact. In this presentation, the experts at Silicon Valley Data Science share their approach for crafting an actionable and flexible data strategy to maximize business value.
You Need a Data Catalog. Do You Know Why? (Precisely)
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata for describing data sets, how they are defined, and where to find them. TDWI research indicates that implementing a data catalog is a top priority among organizations we survey. The data catalog can also play an important part in the governance process. It provides features that help ensure data quality, compliance, and that trusted data is used for analysis. Without an in-depth knowledge of data and associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
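The catalog described above (a central repository of metadata recording what a data set means, who owns it, and where to find it) can be sketched as a minimal data structure; the entry fields, class names and sample values below are illustrative assumptions, not any vendor's schema:

```python
from dataclasses import dataclass, field

# Minimal sketch of a data catalog: metadata entries describing data
# sets, their definitions, and their locations. Fields are illustrative.

@dataclass
class CatalogEntry:
    name: str        # logical data set name
    definition: str  # business definition of the data set
    location: str    # where to find it (e.g. table or file path)
    owner: str       # accountable steward
    tags: list = field(default_factory=list)

class DataCatalog:
    def __init__(self):
        self._entries = {}

    def register(self, entry):
        self._entries[entry.name] = entry

    def find(self, name):
        return self._entries[name]

    def search_by_tag(self, tag):
        # Governance queries, e.g. "list everything tagged pii".
        return [e for e in self._entries.values() if tag in e.tags]

catalog = DataCatalog()
catalog.register(CatalogEntry(
    name="customer_master",
    definition="Golden record of active customers",
    location="warehouse.crm.customer_master",
    owner="CRM data steward",
    tags=["pii", "governed"],
))
print(catalog.find("customer_master").location)
```

Even this toy version shows the governance hook: tag-based search lets stewards enumerate sensitive or governed data sets, which is the starting point for the quality and compliance checks the webinar covers.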
DAS Slides: Building a Data Strategy - Practical Steps for Aligning with Busi... (DATAVERSITY)
Developing a Data Strategy for your organization can seem like a daunting task. The opportunity in getting it right can be significant, however, as data drives many of the key initiatives in today’s marketplace: digital transformation, marketing, customer centricity, and more. This webinar will help de-mystify Data Strategy and Data Architecture and will provide concrete, practical ways to get started.
Building a Data Strategy – Practical Steps for Aligning with Business Goals (DATAVERSITY)
Developing a Data Strategy for your organization can seem like a daunting task – but it’s worth the effort. Getting your Data Strategy right can provide significant value, as data drives many of the key initiatives in today’s marketplace, from digital transformation to marketing, customer centricity, population health, and more. This webinar will help demystify Data Strategy and its relationship to Data Architecture and will provide concrete, practical ways to get started.
The document discusses several frameworks for IT governance - COBIT, ITIL, and Val IT. It describes the key components and benefits of each framework. COBIT focuses on controls and metrics for IT processes, while ITIL provides guidance on service delivery and support. Using the frameworks together can provide a comprehensive approach to IT governance that establishes what should be done as well as how.
Data Architecture Strategies: Building an Enterprise Data Strategy – Where to... (DATAVERSITY)
The majority of successful organizations in today’s economy are data-driven, and innovative companies are looking at new ways to leverage data and information for strategic advantage. While the opportunities are vast, and the value has clearly been shown across a number of industries in using data to strategic advantage, the choices in technology can be overwhelming. From Big Data to Artificial Intelligence to Data Lakes and Warehouses, the industry is continually evolving to provide new and exciting technological solutions.
This webinar will help make sense of the various data architectures and technologies available, and how to leverage them for business value and success. A practical framework will be provided to generate “quick wins” for your organization, while at the same time building towards a longer-term sustainable architecture. Case studies will also show how organizations have successfully built data strategies to support their business goals.
Earnestine is an enterprise architect at X Railways who is trying to establish a collaborative enterprise architecture practice. In the first chapters, she explores the existing organizational structures, processes, and applications to understand the current state which is inconsistent and fragmented across teams. In later chapters, she works to build relationships, gain support from leadership, and establish governance structures to facilitate a collaborative design process across the organization. Her goal is to challenge existing structures and help the business and IT teams work together to improve through a facilitated negotiation space.
The document discusses new regulations from the Basel Committee on Banking Supervision (BCBS 239) that will require banks to improve their risk data aggregation and reporting capabilities. BCBS 239 aims to ensure banks have an accurate and complete view of risks across the organization. It identifies three approaches for complying with the new rules: the "messy approach" of using manual workarounds, the "traditional approach" of keeping separate risk systems, and the "consolidated approach" of integrating risk data into a single system. The article argues that over the long run, only the consolidated approach can fully meet BCBS 239 requirements by providing a single, consistent view of risk data needed for effective risk management.
Smart Data Webinar: A semantic solution for financial regulatory compliance (DATAVERSITY)
In this webinar Mike will describe a practical semantics-based approach to regulatory compliance and reporting for the financial sector using a reference ontology such as the Financial Industry Business Ontology (FIBO). This approach links the reference ontology to existing data resources with minimal disruption to existing data assets. The webinar will describe the kind of ontology that is needed for this kind of application, the principles for building or extending a reference ontology and some of the challenges in mapping this to legacy data.
Data Governance Trends and Best Practices To Implement TodayDATAVERSITY
1) The document discusses best practices for data protection on Google Cloud, including setting data policies, governing access, classifying sensitive data, controlling access, encryption, secure collaboration, and incident response.
2) It provides examples of how to limit access to data and sensitive information, gain visibility into where sensitive data resides, encrypt data with customer-controlled keys, harden workloads, run workloads confidentially, collaborate securely with untrusted parties, and address cloud security incidents.
3) The key recommendations are to protect data at rest and in use through classification, access controls, encryption, confidential computing; securely share data through techniques like secure multi-party computation; and have an incident response plan to quickly address threats.
Joining Forces: Interagency Collaboration and "Smart Power"Booz Allen Hamilton
This document summarizes the findings of a survey of 268 federal employees regarding interagency collaboration and addressing global challenges. Key findings include: 1) While agencies like Defense, State, and USAID share overlapping missions, collaboration is uneven and has not met expectations for some; 2) Budget pressures may increase the need for collaboration but managers are less optimistic it will reduce costs; 3) Smart power approaches remain applicable but support has decreased in some areas; 4) Agencies report having the tools needed but collaborating most effectively with other agencies compared to private/non-profit partners.
Data Democratization for Faster Decision-making and Business Agility (ASEAN)Denodo
Watch full webinar here: https://bit.ly/3ogsO7F
Presented at 3rd Chief Digital Officer Asia Summit
The idea behind Data democratization is to enable every type of user in a company to have access to data and to ensure that there is no dependency on any single party that might create a bottleneck to data access. But this is easier said than done especially given the complex data management landscape that most organizations have today. Data virtualization is a modern data integration technique that not only delivers data in real time without replication but also simplifies data discovery, data exploration and navigating between related data sets.
In this on-demand session, you will understand how data virtualization enables enterprises to:
- Reduce up to 80% the time required to deliver data to the business adapted to the needs of each user
- Apply consistent security and governance policies across the self-service data delivery process
- Seamlessly implement the concept of 'Data Marketplace'
Graph databases provide the ability to quickly discover and integrate key relationships between enterprise data sets. Business use cases such as recommendation engines, social networks, enterprise knowledge graphs, and more provide valuable ways to leverage graph databases in your organization. This webinar will provide an overview of graph database technologies, and how they can be used for practical applications to drive business value.
Creating a clearly articulated data strategy—a roadmap of technology-driven capability investments prioritized to deliver value—helps ensure from the get-go that you are focusing on the right things, so that your work with data has a business impact. In this presentation, the experts at Silicon Valley Data Science share their approach for crafting an actionable and flexible data strategy to maximize business value.
You Need a Data Catalog. Do You Know Why?Precisely
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata for describing data sets, how they are defined, and where to find them. TDWI research indicates that implementing a data catalog is a top priority among organizations we survey. The data catalog can also play an important part in the governance process. It provides features that help ensure data quality, compliance, and that trusted data is used for analysis. Without an in-depth knowledge of data and associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
DAS Slides: Building a Data Strategy - Practical Steps for Aligning with Busi...DATAVERSITY
Developing a Data Strategy for your organization can seem like a daunting task. The opportunity in getting it right can be significant, however, as data drives many of the key initiatives in today’s marketplace: digital transformation, marketing, customer centricity, and more. This webinar will help de-mystify Data Strategy and Data Architecture and will provide concrete, practical ways to get started.
Building a Data Strategy – Practical Steps for Aligning with Business GoalsDATAVERSITY
Developing a Data Strategy for your organization can seem like a daunting task – but it’s worth the effort. Getting your Data Strategy right can provide significant value, as data drives many of the key initiatives in today’s marketplace, from digital transformation to marketing, customer centricity, population health, and more. This webinar will help demystify Data Strategy and its relationship to Data Architecture and will provide concrete, practical ways to get started.
The document discusses several frameworks for IT governance - COBIT, ITIL, and Val IT. It describes the key components and benefits of each framework. COBIT focuses on controls and metrics for IT processes, while ITIL provides guidance on service delivery and support. Using the frameworks together can provide a comprehensive approach to IT governance that establishes what should be done as well as how.
Data Architecture Strategies: Building an Enterprise Data Strategy – Where to...DATAVERSITY
The majority of successful organizations in today’s economy are data-driven, and innovative companies are looking at new ways to leverage data and information for strategic advantage. While the opportunities are vast, and the value has clearly been shown across a number of industries in using data to strategic advantage, the choices in technology can be overwhelming. From Big Data to Artificial Intelligence to Data Lakes and Warehouses, the industry is continually evolving to provide new and exciting technological solutions.
This webinar will help make sense of the various data architectures & technologies available, and how to leverage them for business value and success. A practical framework will be provided to generate “quick wins” for your organization, while at the same time building towards a longer-term sustainable architecture. Case studies will also be provided to show how successful organizations have successfully built a data strategies to support their business goals.
Earnestine is an enterprise architect at X Railways who is trying to establish a collaborative enterprise architecture practice. In the first chapters, she explores the existing organizational structures, processes, and applications to understand the current state which is inconsistent and fragmented across teams. In later chapters, she works to build relationships, gain support from leadership, and establish governance structures to facilitate a collaborative design process across the organization. Her goal is to challenge existing structures and help the business and IT teams work together to improve through a facilitated negotiation space.
The document discusses new regulations from the Basel Committee on Banking Supervision (BCBS 239) that will require banks to improve their risk data aggregation and reporting capabilities. BCBS 239 aims to ensure banks have an accurate and complete view of risks across the organization. It identifies three approaches for complying with the new rules: the "messy approach" of using manual workarounds, the "traditional approach" of keeping separate risk systems, and the "consolidated approach" of integrating risk data into a single system. The article argues that over the long run, only the consolidated approach can fully meet BCBS 239 requirements by providing a single, consistent view of risk data needed for effective risk management.
Smart Data Webinar: A semantic solution for financial regulatory complianceDATAVERSITY
In this webinar Mike will describe a practical semantics-based approach to regulatory compliance and reporting for the financial sector using a reference ontology such as the Financial Industry Business Ontology (FIBO). This approach links the reference ontology to existing data resources with minimal disruption to existing data assets. The webinar will describe the kind of ontology that is needed for this kind of application, the principles for building or extending a reference ontology and some of the challenges in mapping this to legacy data.
Graph analysis of the European public tendersLinkurious
Linkurious is a graph visualization startup that helps companies understand graph data. They developed Linkurious Enterprise, an enterprise-ready graph visualization platform, and linkurious.js, an open-source JavaScript graph visualization library. As an example, they visualized data on European public tenders as a graph using Neo4j. This allowed them to explore connections between firms and identify contracts a firm's customers awarded to its competitors.
This document discusses how Diaku Axon can help organizations comply with the BCBS239 principles for effective risk data aggregation and risk reporting. It provides an overview of BCBS239 and its requirements, and then delves deeper into how Diaku Axon addresses each of the key principles for both risk and data management perspectives. It highlights how Diaku Axon can help establish governance, documentation, controls, and the ability to generate aggregated risk data on demand. It also discusses how Diaku Axon promotes collaboration across business disciplines, regulatory requirements, and enables periodic validation.
The document discusses data stewardship and how to improve it. It emphasizes formalizing accountability for managing data on behalf of an organization. Effective data stewardship requires designated people, processes, and tools to ensure stewards are responsible for governed data. The document provides tips for getting started with data stewardship, including focusing on use cases that provide value and establishing governance with a low compliance cost. It also discusses challenges such as informal stewardship in New Zealand and a DIY attitude among some IT staff.
ALLL Data Management - 2015 Risk Management SummitLibby Bierman
This document discusses key data elements that financial institutions need to collect and store to properly calculate allowances for loan and lease losses (ALLL) and comply with regulatory requirements. It covers loan-level data, collateral data, customer data, risk ratings, and historical loss rates. The document also discusses challenges related to data quality and availability, and preparations needed for the new Current Expected Credit Loss (CECL) model. Overall, it emphasizes the importance of centralized, accessible loan-level data for accurate ALLL calculations and regulatory reporting.
Data Governance, understand what you already know (IBM Global Business Services)IBM Danmark
This document discusses data governance and outlines IBM's leadership in this area. It summarizes that IBM created a Data Governance Council 5 years ago and now leads an Information Governance Community of 550 people working on global challenges. The community is updating the Data Governance Maturity Model to focus on business goals and outcomes by including both technical and process enablers.
Effective data strategies are important for risk management. Surveys from 2008 and 2010 show that companies have made progress in clearly defining objectives and risk appetite, and better integrating risk management into strategic planning. Companies are also moving from defensive to more proactive risk management aimed at business value. Technology has enabled more automated risk identification, analysis, quantification, reporting and monitoring. Experts emphasize the importance of board commitment, dedicated executive leadership, a risk-aware culture, stakeholder engagement, transparency, and using risk data to inform decision-making rather than just avoid risks. Both internal and external data from multiple sources are needed for risk management, but data quality, governance, availability and standards must be addressed.
Alignment: Office of the Chief Data Officer & BCBS 239Craig Milroy
Alignment: Office of the Chief Data Officer & BCBS 239. Alignment overview between OCDO framework and Principles for Effective Risk Data Aggregation and Risk Reporting.
This document discusses the BCBS 239 regulatory requirements for risk data aggregation and risk reporting. It outlines the key components of BCBS 239 including risk governance, infrastructure, data aggregation, and reporting. It also describes a risk data self-assessment diagnostic study that banks should conduct to evaluate their risk operating model, processes, data usage, and infrastructure in order to identify gaps and develop projects to address deficiencies to comply with BCBS 239. Finally, it presents a proposed unified risk data model and architecture to integrate risk data across different risk types and business units.
The document discusses six key questions organizations should ask about data governance: 1) Do we have a government structure in place to oversee data governance? 2) How can we assess our current data governance situation? 3) What is our data governance strategy? 4) What is the value of our data? 5) What are our data vulnerabilities? 6) How can we measure progress in data governance? It provides details on each question, highlighting the importance of leadership, benchmarks, strategic planning, risk assessment, and metrics in developing an effective data governance program.
Understand the nuances of data risk Management and their alignment with Data Management and Governance. Which organisation model will be a best fit to implement Data Risk Management in Governance.
This document reviews several existing data management maturity models to identify characteristics of an effective model. It discusses maturity models in general and how they aim to measure the maturity of processes. The document reviews ISO/IEC 15504, the original maturity model standard, outlining its defined structure and relationship between the reference model and assessment model. It discusses how maturity levels and capability levels are used to characterize process maturity. The document also looks at issues with maturity models and how they can be improved.
This document provides guidance for banks on measuring compliance with BCBS 239, a regulation aimed at improving risk data aggregation and reporting. It outlines three key challenges banks face in implementing BCBS 239: lack of quality data and infrastructure; increasing reporting demands; and measuring compliance with principles-based regulations. It then discusses Deloitte's proposed approach to identifying metrics and thresholds to measure compliance. Specifically, it provides examples of potential metrics for Principles 1 (data architecture and IT infrastructure) and 2 (data accuracy and integrity).
A Review of BCBS 239: Helping banks stay compliantHEXANIKA
Although the challenge to comply with BCBS 239 is vital, the scope is immense. Now that the Jan 2016 deadline for the G-SIBs is up, the rule is expected to extend to other financial institutions and banks. The principles will also apply to all key internal risk management models including market, credit, and counterparty risk. Establishing the principle guidelines and putting core capabilities in place has its merits.
The clarity that effective risk data aggregation provides will help banks streamline their businesses, and can allow banks to make better judgments through more accurate risk analysis. Aggregated information across all channels will enable to provide comprehensive support and services to existing customers. The robust data framework also helps banks supervise and anticipate future problems, giving them a clear view for data analysis.
It can lead to gains in efficiency, reduce probability of losses and enhance strategic decision making, ultimate benefiting a bank’s profitability.
Considerations for an Effective Internal Model Method Implementationaccenture
In this Accenture Finance & Risk presentation we discuss an approach banks can use to develop, manage, and monitor a robust and effective Internal Model Method program. Learn more about the Accenture Finance & Risk Practice: bit.ly/2j2JD6X
BCBS 239 Compliance: A Comprehensive ApproachCognizant
In 2013, the Basel Committee on Banking Supervision (BCBS) issued 14 principles for effectively aggregating risk data and reporting, with the goal of enabling banks to understand and address risk exposures that influence their major decisions. While Global Systemically Important Banks (GSIBs) have made progress in complying with BCBS 239, Domestic Systemically Important Banks (DSIBs) are still in the early stages.
IFRS 17 is a new global accounting standard for insurance contracts that will replace IFRS 4. It requires insurance companies to make significant changes to systems, processes, and financial reporting to be compliant by 2022. Accenture identifies four scenarios for architectural evolution to comply with IFRS 17, ranging from minimal changes to a full re-engineering of architecture to enable data-driven capabilities. Insurance companies must start implementation soon given the scale of changes required to meet the 2022 effective date.
This document discusses the challenges that lenders and special purpose vehicles face in meeting the new IFRS9 accounting regulation, which requires account-level provisioning rather than portfolio-level provisioning. It outlines how building statistical models with high quality account-level data can help meet IFRS9 requirements. HML, a large mortgage data and analytics company, can help lenders by building models and performing stress testing using its extensive mortgage data and expertise in account-level modelling. The document details the type of data needed and HML's process for building statistical models and scorecards to perform IFRS9 calculations at the account level.
The document discusses the future of risk management in banks over the next decade. It states that by 2025, risk functions will need to be fundamentally different and transformed more than in the last decade. Regulations will continue expanding while customer expectations rise. The risk function of the future will have broader responsibilities, stronger collaborative relationships, and expertise in analytics and collaboration over processes. IT and data will be more sophisticated using big data and algorithms. Risk decisions may be made at lower costs while improving customer experience. Banks need to prepare and rebuild risk functions now to thrive during this period of transformation.
This document discusses the need for banks to transform their risk management functions in response to increased regulation and scrutiny. It recommends a three step process for risk transformation: 1) Assess the current state of risk management capabilities, 2) Rationalize and prioritize risk capability objectives, and 3) Transform operations to optimize and deliver the target state. The document emphasizes the importance of collaboration between risk, finance, compliance and the business to establish clear ownership and integrated control frameworks. It also discusses how the Chief Risk Officer can take a leadership role in driving strategic risk management changes.
James Okarimia - IFRS Implementation and How the Banks should Approach IT.JAMES OKARIMIA
IFRS 9 will require significant changes to banks' accounting practices and risk management policies to timely recognize expected credit losses. It will impact governance, policies, processes, data and systems. Banks should treat implementation as a transformation program with three phases - assess current state, design new state, implement changes. This will require involvement from multiple teams across the bank to develop new methodologies, models, and governance around credit risk, impairment forecasting, and more. While the changes are substantial and timeline is tight, proper planning and cross-functional collaboration can help banks successfully complete IFRS 9 implementation.
IFRS Implementation and How the Banks should Approach ItJAMES OKARIMIA
This document discusses how banks should approach implementing IFRS 9 accounting standards. It notes that IFRS 9 implementation will require significant effort across governance, policies, processes, data, and systems. It recommends treating implementation as a transformation program with three phases: assess, design, implement. Key areas of impact include governance, policies and methodology, models, and data requirements. Challenges include a tight timeline, wide organizational impact, data needs, and complexity. Senior management must drive implementation from the top down with cross-functional teams and a focus on change management.
IFRS Implementation and How the Banks should Approach itJAMES OKARIMIA
This document discusses how banks should approach implementing IFRS 9 accounting standards. It notes that IFRS 9 implementation will require significant effort across governance, policies, processes, data, and systems. It recommends treating implementation as a transformation program with three phases: assess, design, implement. Key areas of impact include governance, policies and methodology, models, and data requirements. Challenges include a tight timeline, wide organizational impact, data needs, and complexity. Senior management must drive implementation from the top down with cross-functional teams and a focus on change management.
IFRS Implementation and How the Banks should approach itJAMES OKARIMIA
IFRS Implementation and how the banks should approach it.
Though the final version of IFRS came up in 2014, the banks across the globe have recently embarked the journey. Any new regulation requires significant effort to revisit its existing governance, policies & processes, data and the systems and IFRS 9 is no different.
IFRS Implementation and How the Banks should Approach itJAMES OKARIMIA
This document discusses how banks should approach implementing IFRS 9 accounting standards. It notes that IFRS 9 implementation will require significant changes across governance, policies, processes, data, and systems. It recommends treating implementation as a transformation program with three phases: assess, design, implement. Key areas of impact include governance structure, roles and responsibilities, credit and accounting policies, impairment and risk modeling methodology, and extensive data requirements. The document acknowledges challenges around the tight timeline, wide organizational impact, data needs, and complexity. It stresses the need for senior leadership support and alignment of finance, risk, and IT teams to successfully execute the program.
1) The document discusses the challenges that global systemically important banks (G-SIBs) face in complying with the BCBS239 regulation, which requires them to strengthen risk data aggregation and reporting.
2) A recent report found that many G-SIBs will be unable to fully comply with the regulation's principles by the 2016 deadline and still rely heavily on manual workarounds.
3) Complying with BCBS239 poses unique challenges for G-SIBs in areas like governance, data management, IT infrastructure, and legacy systems. Updating these systems at scale while meeting reporting demands has proven difficult.
James Okarimia - IFRS Implementation and How the Banks should Approach IT.JAMES OKARIMIA
This document discusses IFRS 9 implementation for banks. It provides an overview of IFRS 9 requirements, including classification and measurement of financial assets and liabilities, impairment methodology, and hedge accounting. It recommends banks take a transformation program approach with three phases: assess, design, implement. It identifies key areas of impact like governance, policies, methodology, models, and data. It also discusses challenges of the tight timeline, wide organizational impact, data needs, and complexity involved in IFRS 9 implementation.
Cognizant_Introduction to management consulting in Switzerlandaudrey miguel
Cognizant is launching management consulting services in Switzerland to help clients with strategy, business transformation, customer relationships, and risk management. Since 2004, Cognizant has provided these services primarily to banking, financial services, and insurance clients. The document outlines Cognizant's five specialized consulting practices and experience assisting clients with regulations like IFRS 9, Basel III, BCBS 239, and PRIIPS.
IFRS Implementation and How the Banks should Approach it.JAMES OKARIMIA
This document discusses how banks should approach implementing IFRS 9 accounting standards. It will require significant effort across governance, policies, processes, data, and systems. The implementation should be treated as a transformation program with three phases: assess current state, design new state, and implement changes. It will impact areas like organization structure, roles and responsibilities, credit and accounting policies, impairment and risk models, and extensive data requirements. While the timeline is steep and challenges are wide-ranging, the banks can still succeed if they immediately start the program, align senior management, and execute it properly.
IFRS Implementation and How the Banks should Approach It.JAMES OKARIMIA
IFRS Implementation and How the Banks should Approach It. A publication by James Okarimia
Managing Partner at RM associates
Partners in Enterprise Risk Managements
201310 Risk Aggregation and Reporting. More than Just a Data IssueFrancisco Calzado
Many banks feel overwhelmed by the sheer volume of regulation that is coming their way. It is not surprising, therefore, that when the Basel Committee on Banking Supervision (BCBS) consultative paper, “Principles for effective risk data aggregation and risk reporting” was published in June 2012 it raised a number of concerns
everis was Gold Sponsor of the Marcus Evans Conference ‘4th Edition: Impact of the Fundamental Review of the Trading Book’ at Canary Wharf, London on 23-24th February 2017.
This was a timely opportunity to catch up with banks and solution partners as we move into the implementation phase of Fundamental Review of the Trading Book (FRTB) programmes. We heard views and case studies across a range of topics including market risk methodology, operating model definition and data and systems architecture design.
Our presentation at the conference focused on the architectural challenges posed by FRTB.
1. BCBS 239
Risk data aggregation and reporting
A practical path to compliance and delivering business value
Global Regulatory Reform
2. Contents
01 Banks can't do it all by 2016. They need to prioritize and make the right choices
02 Finding a meaningful and actionable framework to deliver against BCBS 239 principles
03 Coordinating in a practical way and using BCBS 239 to your advantage
04 How EY can help
3. 01 Banks can't do it all by 2016. They need to prioritize and make the right choices

The Basel Committee on Banking Supervision (BCBS) 239 is different from other regulations. It demands that the information banks use to drive decision-making captures all risks with appropriate accuracy and timeliness. By setting out overarching principles of effective risk management reporting and governance, BCBS 239 focuses banks on developing the right capabilities rather than hitting a compliance date. BCBS 239 isn't just about filling in another reporting template. Even after January 2016 (the compliance date for Globally Systemically Important Banks (G-SIBs)), it won't go away.
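The demand that decision-making data capture all risks with appropriate accuracy and timeliness can be made concrete. As a minimal illustration (the feed names, fields and data are invented for this sketch, not drawn from the paper), a firm-wide aggregation step might roll up counterparty exposures across business-line silos and flag feeds that are not current as of the reporting date:

```python
# Hypothetical sketch: aggregating credit exposures from siloed business-line
# feeds into one firm-wide view, with a basic timeliness check.
# Feed names, fields and figures are illustrative only.
from datetime import date

feeds = {
    "retail":    [{"counterparty": "ACME", "exposure_usd": 1_200_000, "as_of": date(2015, 1, 30)}],
    "wholesale": [{"counterparty": "ACME", "exposure_usd": 3_500_000, "as_of": date(2015, 1, 30)}],
    "markets":   [{"counterparty": "GLOBEX", "exposure_usd": 800_000, "as_of": date(2015, 1, 29)}],
}

def aggregate(feeds, reporting_date):
    """Roll up exposures per counterparty and flag feeds with stale data."""
    totals, stale = {}, []
    for feed, rows in feeds.items():
        for row in rows:
            if row["as_of"] != reporting_date:
                stale.append(feed)  # timeliness breach: data not as of the reporting date
            cp = row["counterparty"]
            totals[cp] = totals.get(cp, 0) + row["exposure_usd"]
    return totals, sorted(set(stale))

totals, stale = aggregate(feeds, date(2015, 1, 30))
# totals: firm-wide exposure per counterparty; stale: feeds needing escalation
```

In practice this logic sits in data integration and reconciliation layers rather than ad hoc scripts, but the point stands: "accuracy and timeliness" only become enforceable once they are expressed as checks like these.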
Regulators are viewing BCBS 239 compliance through multiple lenses

Stress testing exercises such as the Comprehensive Capital Analysis and Review (CCAR) in the US, the Firm Data Submission Framework (FDSF) in the UK and the European Banking Authority (EBA) stress tests across Europe have emphasized the capability gaps banks have to bridge. The resources required to run these exercises are not sustainable.

Bank alignment to the principles will also be challenged by other regulations, including the Fundamental Review of the Trading Book (FRTB). If banks fail to demonstrate compliant solutions for data management, data governance and alignment between risk, finance and the business, they will be forced to change the way they model and value their risk. Without change, the rules will require a material increase in the level of capital banks need to hold.

But it doesn't stop there. Other regulations such as the UK's Senior Managers Regime (SMR) will further intensify Board and Senior Management responsibility and accountability in the banking sector, with a major focus on risk control. Given the level of current and future change, BCBS 239 can, and should, be positioned at the heart of coordinating regulatory transformation.
Banks have a lot of work ahead
Evolving the way banks operate and
adapting their supporting data and
technology infrastructures will require
a lot of work. Both the banks and their
regulators recognize the challenges
in fully aligning to the principles by
January 2016.
G-SIBs have mobilized and Domestic
Systemically Important Banks (D-SIBs)
are now mobilizing their approach to
achieve regulatory compliance. A recent
EY survey* on BCBS 239 readiness shows
that banks are viewing the principles as
an enabler for other strategic objectives
aimed at transforming the business to
survive in the new marketplace. Banks
are, however, challenged in making
the join between BCBS 239 principles,
specific capability-based requirements
and their existing book of work across
different functional areas, lines of
business and regions. This is often
evidenced by a limited number of BCBS
239 initiatives, especially at divisional and
regional levels.
With limited investment spend,
banks need to set the right
priorities for 2015
Demonstrating sufficient progress to
the regulator while also moving the
organization towards its existing strategic
goals will be key. Banks need a practical
method to address the challenges ahead:
•• Translate principles into meaningful
and measurable changes
•• Understand the gap to target
capabilities (calibrated against peers)
•• Connect strategic change across risk,
finance, data and technology
•• Gain sufficient momentum in 2015
and beyond with the depth and
breadth of skills required to deliver
•• Measure and monitor progress to
January 2016 and demonstrate
sustained alignment to the principles
beyond 2016

* EY's BCBS 239 Autumn 2014 industry survey of
30 G-SIBs and D-SIBs on prioritizing and mobilizing
projects for 2015
Activities that deliver key
capabilities
EY’s experience of BCBS 239
indicates that banks are prioritizing
specific areas:
•• Data ownership and data quality
frameworks
•• Policy change
•• Critical risk process documentation
(including controls and key data
elements)
•• Service level agreements
•• Data dictionary and lineage
Avoid the cost of
non-compliance
Banks that continue to show
deficiencies in their risk
management capabilities may
experience increased intensity
of supervision and the possible
application of capital add-ons and
other limits on banks’ risk-taking
and growth opportunities.
Figure 1: What % of your BCBS 239-related change programs will be
completed by January 2016? (Bar chart of percentage of respondents per
category: <25%, 25%–50%, 50%–75%, 75%–100%; reported values include
56%, 22% and 22%.)
Key findings from EY’s BCBS
239 Autumn 2014 industry
survey of 30 G-SIBs and D-SIBs
on prioritizing and mobilizing
projects for 2015
•• Our survey (see figure 1) showed
that most respondents think a
significant part of their BCBS 239
change delivery will not be complete
by January 2016.
•• 89% of respondents viewed BCBS
239 as an enabler to shape their
IT strategy and develop their IT
infrastructure.
•• 78% of respondents viewed
BCBS 239 as an enabler for their
enterprise-wide data management
capability objectives.
•• 67% of respondents viewed BCBS
239 as an initiative to help drive
operational efficiency and other
cost reduction initiatives.
02 Finding a meaningful and actionable framework to deliver against BCBS 239 principles
Through our experience of working with
many G-SIBs and D-SIBs on their BCBS
239 self-assessment and compliance
planning activities, EY has developed a
Risk Data Aggregation and Reporting
(RDAR) Framework. Banks can use this
proven approach to manage change
successfully, by integrating and
delivering their BCBS 239 compliance
objectives alongside their strategic
investment program.
EY’s RDAR Framework presents a
capability objective view through which
banks can:
1. Clearly articulate BCBS 239 compliance requirements and priorities
2. Make the join across strategic change programs
3. Align other regulatory changes in a coordinated way
4. Rapidly form an approach to manage change successfully
Armed with a common language, banks
can use BCBS 239 as a lever to align
objectives and coordinate delivery.
Without such a framework, we have
seen banks struggle to achieve a common
understanding or a consistent response
and measurement across the enterprise.
Banks can use EY’s RDAR Framework
to create a common and meaningful
understanding of the changes required
across lines of business, functions and
regions, and an approach to deliver the
change in a coordinated and effective
way. Banks can use the same framework
to align strategic program objectives
alongside BCBS 239-specific execution-
level needs. Joining up the organization
around BCBS 239 has clear advantages,
both externally, in terms of demonstrating
a cohesive response to the regulator, and
internally, by prioritizing the development
of risk management capabilities.
EY’s RDAR Framework is not just theory;
it has been tried and tested with a number
of clients and has been used to identify
and define practical road maps towards
compliance.
The RDAR Framework comprises five capability components:

Strategy and governance
•• Data governance
•• Board and senior management accountability
•• Risk policy and design principles
•• Stress and crisis adaptability

Processes and controls
•• Key processes and controls
•• Reconciliations
•• Amendments and adjustments

People and organization
•• Capability and performance management
•• Resource alignment

Analytics and decision support
•• Stress testing
•• Risk matrix and measures
•• Management information and reporting

Data and technology
•• IT strategy and governance
•• Golden sources and data lineage
•• IT architecture and systems
•• Data organization and definition
•• IT performance
•• Data quality
•• End-user computing
How EY’s Risk Data Aggregation and Reporting
Framework can address banks’ challenges
As important regulatory changes
continue to sweep across the banking
landscape, both regulators and banks
want to do the right thing. They want
to invest in a coordinated and strategic
fashion to build a more performant and
stable banking environment. Banks face
a number of challenges from regulations
being expressed in different ways, and
implemented with different timescales and
overlapping jurisdictional applications.
Banks need to avoid a siloed response to
managing regulatory change if they are
to build the right platform to develop their
business, generate returns on regulatory
change spend and meet the challenge of
today’s market. To achieve this, banks
need to connect the complex array
of global regulatory changes in a way
that has relevance to how they operate
and that allows them to manage the
required change. EY’s RDAR Framework
supports this integration of priorities,
using common capability objectives
spread across the BCBS 239 principles to
create the join. The Framework is helping
banks to rapidly understand what BCBS
239 means to the way they operate,
moving from principles to business-ready
requirements.
Banks have work to do across all
components of the RDAR Framework.
It can help banks articulate and prioritize
change alongside organization-specific
objectives with a number of common
focus areas. The emerging priority areas
of investment for 2015 include:
•• Data ownership and data quality
frameworks
•• Policy change
•• Critical risk process documentation
(including controls and key data
elements)
•• Service level agreements
•• Data dictionary and lineage
At most banks, existing investment
programs have the potential to achieve
much of the progress required toward
BCBS 239 compliance. Mapping specific
BCBS 239 change demands against the
bank’s investment programs is critical
to leveraging the committed investment
in an efficient manner. With EY’s
RDAR Framework, banks can make
the connection across their large scale
finance, risk and data programs.
Using a common language of capabilities, joins are quickly identified, allowing for a rapid assessment of how much mutual value can
be realized — for both the BCBS 239 compliance work and the transformation programs.
1. RDAR Framework clearly articulates BCBS 239 compliance requirements and priorities
2. RDAR Framework can help banks make the join across strategic change programs
Figure: BCBS 239 connects objectives across finance, risk and data programs

Finance
•• Improve data quality
•• Harmonize data architecture
•• Harmonize data governance
•• Share data management and reporting services

Risk
•• Commoditize risk reporting activity
•• Reduce time spent on data issues and generating reports
•• Embed risk control standards
•• Reduce cost and develop risk utilities

Data
•• Improve data governance models
•• Improve data accountability
•• Implement data dictionary and standards
•• Improve data quality
There are a number of new and enhanced
requirements that are impacting the
reporting and disclosure landscape. Links
between them are complex, overlaps
are rife, and achieving consistency is
becoming more challenging. The direction
of travel is firmly toward a) greater
granularity, and b) more public disclosure.
There is also increasing overlap between
the demands placed on internal and
external reporting requirements. Getting
risk management and reporting right has
never been more critical.
Regulators now have access to more data
and insights, which expose any lack of join
across an organization. EY has identified
clear overlaps and specific capability
requirements across regulations by
aligning different regulatory requirements
to the principles: for example, mapping
the review of the Pillar 3 disclosure
requirements against BCBS 239.
Cross-regulatory change can be
coordinated and managed under the
RDAR Framework.
3. RDAR Framework aligns other regulatory changes in a coordinated way
(Bar chart of percentage of respondents per category: No or minimal
impact, 0%; Yes, minimal considerations only, 33%; Yes, integral to
future state design, 67%.)
Align BCBS 239 and other
regulatory initiatives
Our survey (see figure 2), found that
all respondents stated that other
regulatory initiatives such as FDSF,
CCAR and EBA stress testing are
affecting their response to BCBS 239.
Two-thirds of banks are aligning cross-
regulatory requirements as an integral
part of their future state design.
Figure 2: Are regulatory initiatives such as FDSF, CCAR and EBA stress
testing affecting your BCBS 239 implementation?
There is a strong link between BCBS 239 and other regulatory reporting and
disclosure developments. Important considerations include:
•• Increasing focus on risk and finance
data integration
•• Developing common reporting
processes and controls and aligning
regulatory reporting and financial
statement information
•• Increasing visibility and potential
impact (operational overheads)
to manage, control and monitor
adjustments and amendments
•• Growing demand to improve the
connectivity of information and
transparency of data lineage,
including aggregation and
transformation logic
•• Optimizing data granularity and
dimensionality capabilities
•• Promotion of data standards —
internal and external
•• Developing strong data governance
and data quality
•• Improving the consistency and
reliability of risk exposures and
forecasts — explicit comparisons
of hypothetical and actual trading
outcomes
•• Implementing new risk measures,
e.g., resilience measures and
dashboard of key metrics
Planning compliance
Once requirements have been grouped
using EY’s RDAR Framework, banks can
establish and sequence logical change
components:
•• Self-assessment findings (where are
the biggest gaps to bridge?)
•• Relative benefits vs. time and
complexity of change
•• Alignment across other regulatory
developments
•• Alignment to other large-scale
investment programs
•• Other organization-specific priorities
Demonstrating continued
compliance
Banks need to track delivery against the
deadlines agreed with their supervisor.
But how many are looking to measure
their continued compliance with the
principles beyond that point? Banks need
to put in place mechanisms to measure
continued compliance after project teams
have been stood down.
EY’s RDAR Framework approach has
helped a number of banks to develop a
traceable mechanism to demonstrate
progress against agreed plans. These
solutions focus primarily on program
alignment and project tracking. In addition
to base-level BCBS 239 progress tracking,
EY has developed solutions to take
banks beyond compliance deadlines, to
meet both continued regulatory scrutiny
and the bank’s own requirements to
deliver sustained business value and
effectiveness. These are not “nice to
have” options; regulators expect banks
to perform internal reviews of their BCBS
239 programs and to be prepared for
reviews by their regulators.
4. RDAR Framework rapidly forms an approach to manage change successfully
03 Coordinating in a practical way and using BCBS 239 to your advantage
EY’s RDAR Framework can help capture
and structure the changes demanded
by BCBS 239. Banks should look to use
the principles of BCBS 239 to coordinate
investment in people, process, technology
and data. Without a structured approach
to join up, prioritize and sequence change
demands across the organization, banks
will deliver regulatory change inefficiently
at best; at worst, it will be non-compliant
and ineffective.
With the right approach, and steer from
the top, banks have an opportunity
to tackle long-standing problems
that restrict their ability to aggregate
risk exposures and deliver the right
information at the right time to support
their decision-making needs fully. Banks
need to deliver:
•• A common view of business activity
across risk and finance
•• Risk control standards in line with their
finance counterparts
•• Sustainable operating models to
manage increased frequency, volumes
and governance expectations
•• Radically improved data quality and
transparency across the risk
management life cycle
Banks can view BCBS 239
as just another regulation
with which to comply.
Alternatively, banks can
use the principles to align
the way they operate
and direct their change
priorities across the
organization. With the
right approach, banks can
develop a practical path
to compliance and deliver
business value.
04 How EY can help
EY has extensive experience helping
organizations navigate through
BCBS 239 and associated risk
management transformation. We
have supported a significant number
of G-SIBs and D-SIBs through self-
assessments, development of target
operating models, implementation
of road maps and program
mobilization.
Our network of former senior
regulators is supported by global
enterprise intelligence, risk and
finance technology enablement
teams, meaning EY brings a broad
range of experience and skills to
diagnose, design and implement
change successfully. EY’s BCBS 239
Framework helps banks ensure that
regulatory compliance commitments
are clearly articulated and aligned to
other strategic objectives underway
at the bank. It provides a set of
tools, techniques, methodologies
and approaches that enable and
accelerate BCBS 239 compliance
and the release of business value.
EY teams can rapidly identify focus
areas, then apply a set of BCBS
239-specific tools and methods
to accelerate delivery. Our teams
have the depth and breadth of skills
required, combining significant BCBS
239, data, risk management and IT
change experience.
Contacts
Dan Higgins
EMEIA IT Advisory Leader
Email: dan.higgins@ey.com
Tel: + 44 20 795 14788
Neil Thewlis
EMEIA BCBS 239 Leader
Email: nthewlis@uk.ey.com
Tel: + 44 20 795 16019
Richard Powell
UK BCBS 239 Leader
Email: rpowell@uk.ey.com
Tel: + 44 20 795 10817
Jared Chebib
Risk Advisory
Email: jchebib@uk.ey.com
Tel: + 44 20 795 15941
Rob Toguri
EMEIA Enterprise
Intelligence Leader
Email: rtoguri@uk.ey.com
Tel: + 44 20 795 12470