- Credit Suisse is a global financial services company providing banking services to companies, institutional clients, high-net-worth individuals, and retail clients in Switzerland. It has more than 48,000 employees in over 50 countries.
- Reference data is foundational data used across business transactions, such as client, product, and legal entity data. Consistent reference data is important for accurate reporting and analysis. However, Credit Suisse currently faces the challenge of inconsistent views of reference data across applications.
- Credit Suisse's vision is to implement a multi-domain reference data management strategy using a central platform to provide consistent, validated reference data across the organization and reduce complexity.
Reference data is something we often encounter in our projects. In our experience, it is often underestimated and does not get enough attention. In this webinar, we want to make you aware of some interesting aspects of reference data, such as how it relates to MDM, with which it is often confused.
There’s growing recognition in the analyst community that reference data is a form of master data that requires its own governance. Locations, currency codes, financial accounts, and organizational hierarchies are so widely used across an organization that mismatches can result in reconciliation issues, poor-quality analytics, or even transactional failures.
While it’s easy to see how poor reference data management (RDM) can cause problems, many companies struggle with determining how to get started. Multiple questions arise: What’s the scope? How should one choose between RDM solutions? How do I compute ROI? To answer these questions and more, Orchestra Networks teamed up with Aaron Zornes, Chief Research Officer of the MDM Institute and Godfather of MDM, for “Everything you ever wanted to know about Reference Data (but were afraid to ask).”
In this hour-long webcast featuring Aaron Zornes (MDM Institute) and Conrad Chuang (Orchestra Networks), you will learn:
- Characteristics of reference data
- Key features of a reference data management (RDM) solution
- Lessons learned from RDM implementations
- And more
Activate Data Governance Using the Data Catalog (DATAVERSITY)
Data Governance programs depend on the activation of data stewards who are held formally accountable for how they manage data. The data catalog is a critical tool that enables your stewards to contribute to and interact with an inventory of metadata about data definition, production, and usage. This interaction is active Data Governance in the truest sense of the word.
In this RWDG webinar, Bob Seiner will share tips and techniques focused on activating your data stewards through a data catalog. Data Governance programs that involve stewards in daily activities are more likely to demonstrate value from their data-intensive investments.
Bob will address the following in this webinar:
- A comparison of active and passive Data Governance
- What it means to have an active Data Governance program
- How a data catalog tool can be used to activate data stewards
- The role a data catalog plays in Data Governance
- The metadata in the data catalog will not govern itself
Reference and master data management:
Three categories of structured data:
Master data: data associated with core business entities such as customer, product, and asset.
Transaction data: records of business transactions such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
Reference data: any kind of data used solely to categorize other data found in a database, or solely to relate data in a database to information beyond the boundaries of the enterprise.
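To make the three categories concrete, here is a minimal Python sketch (the records and the currency table are invented for illustration) in which a reference table exists solely to categorize and validate other data:

```python
# Reference data: a code table used only to classify/validate other data.
CURRENCY_CODES = {"USD": "US Dollar", "EUR": "Euro", "CHF": "Swiss Franc"}

# Master data: a core business entity (a customer).
customer = {"id": 42, "name": "Acme Corp"}

# Transaction data: a record of a business event that refers to both.
payment = {"customer_id": 42, "amount": 1999.0, "currency": "CHF"}

def validate_payment(txn, currencies=CURRENCY_CODES):
    """A transaction is valid only if it carries a known currency code."""
    return txn["currency"] in currencies

print(validate_payment(payment))  # True
```

The point of the sketch: the transaction depends on the reference table for meaning, which is why an inconsistent currency list across applications breaks reconciliation.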
Strategic Business Requirements for Master Data Management Systems (Boris Otto)
This presentation describes strategic business requirements of master data management (MDM) systems. The requirements were developed in a consortium research approach by the Institute of Information Management at the University of St. Gallen, Switzerland, and 20 multinational enterprises.
The presentation was given at the 17th Americas Conference on Information Systems (AMCIS 2011) in Detroit, MI.
The research paper on which this presentation is based can be found here: http://www.alexandria.unisg.ch/Publikationen/Zitation/Boris_Otto/177697
Overcoming the Challenges of your Master Data Management Journey (Jean-Michel Franco)
This presentation walks you through all the key steps of an MDM initiative, showcasing the key milestones and building blocks you will need to roll out along your MDM journey.
-> Please contact Talend for a dedicated interactive session with a storyboard by customer domain
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is neither difficult nor time-consuming, and it helps ensure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
MDM Institute: Why is Reference Data Mission Critical Now? (Orchestra Networks)
Learn why market-leading enterprises are focusing on RDM in this exclusive webinar from MDM research analyst Aaron Zornes
More than 55% of large enterprises surveyed by the MDM Institute are planning on implementing reference data management (RDM) in the next 18 months.
Why is RDM mission critical today?
How does RDM differ from (how is it similar to) MDM?
What are the top business drivers for RDM?
What are the “top 10” technical evaluation criteria?
Where are most organizations focusing their RDM efforts?
Aaron Zornes, Chief Research Officer of the MDM Institute, answers these questions and more as he reveals findings from the first-ever RDM market study, based on a 1Q2014 survey of more than 75 Global 5000-size enterprises.
Webinar: How Banks Manage Reference Data with MongoDB (MongoDB)
Managing and distributing reference data globally has always been a challenge for financial institutions. Maintaining database schemas while integrating and replicating that data across geographies is costly and time-consuming. MongoDB's native replication capabilities and partitioned architecture make it simple to distribute and synchronize data efficiently across the globe. MongoDB's dynamic schema dramatically reduces database maintenance for schema migrations: data structure changes can be applied with no downtime and no impact on existing applications.
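To illustrate the dynamic-schema idea described above (this is not MongoDB's actual API, just a plain-Python stand-in for a document collection), note how documents in the same collection can carry different fields, so adding an attribute requires no schema migration:

```python
# A plain list stands in for a MongoDB collection of reference documents.
collection = []

# Original reference document.
collection.append({"_id": "CHF", "name": "Swiss Franc"})

# A later document carries an extra field; no ALTER TABLE, and existing
# documents are untouched -- this is the "dynamic schema" property.
collection.append({"_id": "USD", "name": "US Dollar", "minor_unit": 2})

def find(coll, **query):
    """Return documents whose fields equal the query values, MongoDB-style."""
    return [d for d in coll if all(d.get(k) == v for k, v in query.items())]

print(find(collection, _id="CHF"))  # [{'_id': 'CHF', 'name': 'Swiss Franc'}]
```

In MongoDB itself the same effect falls out of storing each document as independent BSON, which is what lets structure changes roll out without downtime.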
Data-Ed Slides: Best Practices in Data Stewardship (Technical) (DATAVERSITY)
In order to find value in your organization's data assets, heroic data stewards are tasked with saving the day, every single day! These heroes adhere to a data governance framework and work to ensure that data is captured right the first time, validated through automated means, and integrated into business processes. Whether it's data profiling or in-depth root cause analysis, data stewards can be counted on to ensure the organization's mission-critical data is reliable. In this webinar, we will walk through this framework and highlight important facets of a data steward's role.
Learning Objectives:
- Understand the business need for a data governance framework
- Learn why embedded data quality principles are an important part of system/process design
- Identify opportunities to help drive your organization to a data driven culture
Emerging Trends in Data Architecture – What’s the Next Big Thing (DATAVERSITY)
Digital Transformation is a top priority for many organizations, and a successful digital journey requires a strong data foundation. This transformation requires a number of core data management capabilities such as MDM. With technological innovation and change occurring at an ever-increasing rate, it’s hard to keep track of what’s hype and what can provide practical value for your organization. Join this webinar to see the results of a recent DATAVERSITY survey on emerging trends in Data Architecture, along with practical commentary and advice from industry expert Donna Burbank.
The first step towards understanding data assets’ impact on your organization is understanding what those assets mean for each other. Metadata — literally, data about data — is a practice area required by good systems development, and yet is also perhaps the most mislabeled and misunderstood Data Management practice. Understanding metadata and its associated technologies as more than just straightforward technological tools can provide powerful insight into the efficiency of organizational practices, and enable you to combine practices into sophisticated techniques, supporting larger and more complex business initiatives. Program learning objectives include:
* Understanding how to leverage metadata practices in support of business strategy
* Foundational metadata concepts
* Guiding principles for, and lessons learned from, the practical uses of metadata in applied strategy
* Metadata strategies, including:
* Metadata is a gerund, so don’t try to treat it as a noun
* Metadata is the language of Data Governance
* Treat glossaries/repositories as capabilities, not technology
In this lecture we discuss data quality in general and data quality in Linked Data. This 50-minute lecture was given to master's students at Trinity College Dublin (Ireland) and covered the following:
1) Defining Quality
2) Defining Data Quality - What, Why, Costs
3) Identifying problems early - using a simple semantic publishing process as an example
4) Assessing Linked (big) Data quality
5) Quality of LOD cloud datasets
References can be found at the end of the slides.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0) International License.
Data Architecture, Solution Architecture, Platform Architecture — What’s the ... (DATAVERSITY)
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
How to Build & Sustain a Data Governance Operating Model (DATUM LLC)
Learn how to execute a data governance strategy through creation of a successful business case and operating model.
Originally presented to an audience of 400+ at the Master Data Management & Data Governance Summit.
Visit www.datumstrategy.com for more!
Data Architecture Best Practices for Advanced Analytics (DATAVERSITY)
Many organizations are immature when it comes to data and analytics use. The answer lies in delivering a greater level of insight from data, straight to the point of need.
Many Data Architecture best practices have accumulated from years of practice. In this webinar, William will look at some that he believes have emerged in the past two years and are not yet worked into many enterprise data programs. These are keepers that organizations will need to adopt by one means or another, so it’s best to work them into the environment mindfully.
Master Data Management – Aligning Data, Process, and Governance (DATAVERSITY)
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as customers, products, vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing and analytic reporting. This webinar will provide practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Good data is like good water: best served fresh, and ideally well-filtered. Data Management strategies can produce tremendous procedural improvements and increased profit margins across the board, but only if the data being managed is of high quality. Determining how Data Quality should be engineered provides a useful framework for utilizing Data Quality management effectively in support of business strategy, which in turn allows for speedy identification of business problems, delineation between structural and practice-oriented defects in Data Management, and proactive prevention of future issues.
Over the course of this webinar, we will:
- Help you understand foundational Data Quality concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK), as well as guiding principles, best practices, and steps for improving Data Quality at your organization
- Demonstrate how chronic business challenges for organizations are often rooted in poor Data Quality
- Share case studies illustrating the hallmarks and benefits of Data Quality success
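As a small sketch of what "engineering" Data Quality into a process can look like in practice (the fields, rules, and reference list below are hypothetical, not drawn from the DMBOK), rules can flag defective records at capture time rather than after they pollute downstream reports:

```python
import re

# Hypothetical quality rules, one per field. In a real program these would be
# defined and owned by data stewards, not hard-coded.
RULES = {
    "customer_id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "country": lambda v: v in {"CH", "US", "DE"},  # assumed reference list
}

def quality_issues(record):
    """Return the names of the fields that fail their quality rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

bad = {"customer_id": -1, "email": "not-an-email", "country": "CH"}
print(quality_issues(bad))  # ['customer_id', 'email']
```

Running such checks at the point of capture is one way to distinguish structural defects (a rule that is wrong) from practice-oriented ones (data entered badly), as the abstract above suggests.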
You Need a Data Catalog. Do You Know Why? (Precisely)
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata describing data sets, how they are defined, and where to find them. TDWI research indicates that implementing a data catalog is a top priority among the organizations we survey. The data catalog can also play an important part in the governance process: it provides features that help ensure data quality and compliance, and that trusted data is used for analysis. Without in-depth knowledge of data and its associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
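As a toy illustration of what such a catalog holds (the structure and the entries below are invented for illustration, not any vendor's API), a minimal registry of data-set metadata might look like:

```python
# A central registry of metadata: what each data set means, where it lives,
# and who stewards it. Names, locations, and owners here are hypothetical.
catalog = {}

def register(name, definition, location, owner):
    """Add a data set's metadata entry to the catalog."""
    catalog[name] = {"definition": definition, "location": location, "owner": owner}

def search(term):
    """Find entries whose name or definition mentions the term."""
    term = term.lower()
    return [n for n, m in catalog.items()
            if term in n.lower() or term in m["definition"].lower()]

register("customer_master", "Golden record of customers", "s3://lake/customers", "CRM stewards")
register("fx_rates", "Daily currency exchange rates", "db.finance.fx", "Treasury")
print(search("currency"))  # ['fx_rates']
```

Even this skeleton shows the governance hook: every entry names an owner, which is what lets stewards be held accountable for the data sets they curate.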
How to Implement Data Governance Best Practice (DATAVERSITY)
Data Governance Best Practices are defined as the basis and guidelines for suggested governing activities. Organizations define best practices to use as a point of comparison when determining their readiness, willingness, and the actions necessary to put a Data Governance program in place. But what are the best practices, and how can they be implemented? This webinar will address these questions and more.
In this RWDG webinar, Bob Seiner will talk about how to create, validate, assess, and implement Data Governance Best Practice with immediate impact on present and future Data Governance activities. The result of a Best Practice assessment is a thorough, actionable plan focused on demonstrating value from your Data Governance program. This webinar will cover:
• Two Criteria for Data Governance Best Practice Development
• How to Assess against Best Practice to Build Program Success
• Examples of Industry Selected DG Best Practice
• How to Communicate DG Best Practice in a Non-Threatening Way
• How to Build DG Best Practice into Daily Operations
Gartner: Master Data Management Functionality (Gartner)
Gartner will further examine key trends shaping the future MDM market during the Gartner MDM Summit 2011, 2-3 February in London. More information at www.europe.gartner.com/mdm
Increasing Agility Through Data Virtualization (Denodo)
At the Data Summit Conference in New York, our CMO Ravi Shankar and BJ Fesq, Chief Data Officer at CIT Group, discussed the modernization of data architectures with data virtualization.
This presentation explores how data virtualization is being used to dramatically reduce data proliferation and ensure that all consumers are working with a single source of the truth. It also looks at how data virtualization can drive standardization, measure and improve data quality, abstract data consumers from data providers, expose data lineage, enable cross-company data integration, and serve as a common provisioning point from which to access all authoritative sources of data.
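A bare-bones sketch of the data-virtualization idea (the two sources and the customer record are hypothetical): consumers query one virtual view, and the layer federates across the underlying systems at request time instead of replicating their data:

```python
# Two underlying systems that each hold part of the customer picture.
crm_db = {"42": {"name": "Acme Corp", "segment": "Enterprise"}}   # source 1
billing_db = {"42": {"balance": 1250.0}}                          # source 2

def virtual_customer_view(customer_id):
    """Join both sources on the fly; nothing is copied or materialized."""
    merged = {}
    merged.update(crm_db.get(customer_id, {}))
    merged.update(billing_db.get(customer_id, {}))
    return merged

print(virtual_customer_view("42"))
# {'name': 'Acme Corp', 'segment': 'Enterprise', 'balance': 1250.0}
```

Because consumers see only the merged view, the sources can be swapped or restructured behind it, which is the abstraction of consumers from providers that the abstract describes.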
Overcoming the Challenges of your Master Data Management JourneyJean-Michel Franco
This Presentaion runs you through all the key steps of an MDM initiative. It considers and showcase the key milestones and building blocks that you will have to roll-out to make your MDM
journey
-> Please contact Talend for a dedicated interactive sessions with a storyboard by customer domain
To take a “ready, aim, fire” tactic to implement Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming and can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
MDM Institute: Why is Reference data mission critical now?Orchestra Networks
Learn why market-leading enterprises are focusing on RDM in this exclusive webinar from MDM research analyst Aaron Zornes
More than 55% of large enterprises surveyed by the MDM Institute are planning on implementing reference data management (RDM) in the next 18 months.
Why is RDM mission critical today?
How does RDM differ from (how is it similar to) MDM?
What are the top business drivers for RDM?
What are the “top 10” technical evaluation criteria?
Where are most organizations focusing their RDM efforts?
Aaron Zornes, Chief Research Officer of the MDM Institute, answers these questions and more when he reveals findings from the first ever RDM market study based on a 1Q2014 survey of 75+ global 5000 size enterprises.
Webinar: How Banks Manage Reference Data with MongoDBMongoDB
Managing and distributing reference data globally has always been a challenge for financial institutions. Managing and maintaining database schemas while integrating and replicating that data across geographies is costly and time consuming. MongoDB's native replication capabilities and partitioned architecture make it simple to distribute and synchronize data efficiently across the globe. MongoDB’s dynamic schema dramatically reduces database maintenance for schema migrations – data structure changes can be applied with no down time, and with no impact to existing applications.
Data-Ed Slides: Best Practices in Data Stewardship (Technical)DATAVERSITY
In order to find value in your organization's data assets, heroic data stewards are tasked with saving the day- every single day! These heroes adhere to a data governance framework and work to ensure that data is: captured right the first time, validated through automated means, and integrated into business processes. Whether its data profiling or in depth root cause analysis, data stewards can be counted on to ensure the organization's mission critical data is reliable. In this webinar we will approach this framework, and punctuate important facets of a data steward’s role.
Learning Objectives:
- Understand the business need for a data governance framework
- Learn why embedded data quality principles are an important part of system/process design
- Identify opportunities to help drive your organization to a data driven culture
Emerging Trends in Data Architecture – What’s the Next Big ThingDATAVERSITY
Digital Transformation is a top priority for many organizations, and a successful digital journey requires a strong data foundation. Creating this digital transformation requires a number of core data management capabilities such as MDM, With technological innovation and change occurring at an ever-increasing rate, it’s hard to keep track of what’s hype and what can provide practical value for your organization. Join this webinar to see the results of a recent DATAVERSITY survey on emerging trends in Data Architecture, along with practical commentary and advice from industry expert Donna Burbank.
The first step towards understanding data assets’ impact on your organization is understanding what those assets mean for each other. Metadata — literally, data about data — is a practice area required by good systems development, and yet is also perhaps the most mislabeled and misunderstood Data Management practice. Understanding metadata and its associated technologies as more than just straightforward technological tools can provide powerful insight into the efficiency of organizational practices, and enable you to combine practices into sophisticated techniques, supporting larger and more complex business initiatives. Program learning objectives include:
* Understanding how to leverage metadata practices in support of business strategy
* Discuss foundational metadata concepts
* Guiding principles for and lessons previously learned from metadata and its practical uses applied strategy
* Understanding how to leverage metadata practices in support of business strategy
* Metadata strategies, including:
* Metadata is a gerund so don’t try to treat it as a noun
* Metadata is the language of Data Governance
* Treat glossaries/repositories as capabilities, not technology
In this lecture we discuss data quality and data quality in Linked Data. This 50 minute lecture was given to masters student at Trinity College Dublin (Ireland), and had the following contents:
1) Defining Quality
2) Defining Data Quality - What, Why, Costs
3) Identifying problems early - using a simple semantic publishing process as an example
4) Assessing Linked (big) Data quality
5) Quality of LOD cloud datasets
References can be found at the end of the slides
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 (CC-BY-SA-40) International License.
Data Architecture, Solution Architecture, Platform Architecture — What’s the ...DATAVERSITY
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
How to Build & Sustain a Data Governance Operating Model DATUM LLC
Learn how to execute a data governance strategy through creation of a successful business case and operating model.
Originally presented to an audience of 400+ at the Master Data Management & Data Governance Summit.
Visit www.datumstrategy.com for more!
Data Architecture Best Practices for Advanced AnalyticsDATAVERSITY
Many organizations are immature when it comes to data and analytics use. The answer lies in delivering a greater level of insight from data, straight to the point of need.
There are so many Data Architecture best practices today, accumulated from years of practice. In this webinar, William will look at some Data Architecture best practices that he believes have emerged in the past two years and are not worked into many enterprise data programs yet. These are keepers and will be required to move towards, by one means or another, so it’s best to mindfully work them into the environment.
Master Data Management – Aligning Data, Process, and GovernanceDATAVERSITY
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as customers, products, vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing and analytic reporting. This webinar will provide practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Good data is like good water: best served fresh, and ideally well-filtered. Data Management strategies can produce tremendous procedural improvements and increased profit margins across the board, but only if the data being managed is of a high quality. Determining how Data Quality should be engineered provides a useful framework for utilizing Data Quality management effectively in support of business strategy, which in turns allows for speedy identification of business problems, delineation between structural and practice-oriented defects in Data Management, and proactive prevention of future issues.
Over the course of this webinar, we will:
Help you understand foundational Data Quality concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK), as well as guiding principles, best practices, and steps for improving Data Quality at your organization
Demonstrate how chronic business challenges for organizations are often rooted in poor Data Quality
Share case studies illustrating the hallmarks and benefits of Data Quality success
You Need a Data Catalog. Do You Know Why?Precisely
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata for describing data sets, how they are defined, and where to find them. TDWI research indicates that implementing a data catalog is a top priority among organizations we survey. The data catalog can also play an important part in the governance process. It provides features that help ensure data quality, compliance, and that trusted data is used for analysis. Without an in-depth knowledge of data and associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
How to Implement Data Governance Best PracticeDATAVERSITY
Data Governance Best Practice is defined as basis and guidelines for suggested governing activities. Organizations define best practices to be used as a point of comparison when determining their readiness, willingness and actions necessary to put a Data Governance program in place. But what are the best practices and how can they be implemented? This webinar will address these questions and more.
In this RWDG webinar, Bob Seiner will talk about how to create, validate, assess and implement Data Governance Best Practice with immediate impact on present and future Data Governance activities. The result of a Best Practice assessment is a thorough actionable plan focused on demonstrating value from your Data Governance program.This webinar will cover:
• Two Criteria for Data Governance Best Practice Development
• How to Assess against Best Practice to Build Program Success
• Examples of Industry Selected DG Best Practice
• How to Communicate DG Best Practice in a Non-Threatening Way
• How to Build DG Best Practice into Daily Operations
Gartner: Master Data Management Functionality (Gartner)
Gartner will further examine key trends shaping the future MDM market during the Gartner MDM Summit 2011, 2-3 February in London. More information at www.europe.gartner.com/mdm
Increasing Agility Through Data Virtualization (Denodo)
During the Data Summit Conference in New York, our CMO Ravi Shankar and BJ Fesq, Chief Data Officer at CIT Group, discussed the modernization of data architectures with data virtualization.
This presentation explores how data virtualization is being used to dramatically reduce data proliferation and ensure that all consumers are working with a single source of the truth. It also looks at how data virtualization can drive standardization, measure and improve data quality, abstract data consumers from data providers, expose data lineage, enable cross-company data integration, and serve as a common provisioning point from which to access all authoritative sources of data.
GDPR Noncompliance: Avoid the Risk with Data Virtualization (Denodo)
You can watch the full webinar on-demand here: https://goo.gl/2f2RYF
In its recent report “Predictions 2018: A year of reckoning”, Forrester predicts that 80% of firms affected by GDPR will not comply with the regulation by May 2018. Of those noncompliant firms, 50% will intentionally not comply.
Compliance doesn’t have to be this difficult! What if you have an opportunity to facilitate GDPR compliance with a mature technology and significant cost reduction? Data virtualization is a mature, cost-effective technology that enables privacy by design to facilitate GDPR compliance.
Attend this session to learn:
• How data virtualization provides a GDPR compliance foundation with data catalog, auditing, and data security.
• How you can enable a single enterprise-wide data access layer with guardrails.
• Why data virtualization is a must-have capability for compliance use cases.
• How Denodo’s customers have facilitated compliance.
Bridging Data Gaps with a Solid Data Foundation - A Key Imperative for Today’... (Denodo)
Watch full webinar here: https://bit.ly/3CjoaxS
In this session, the panel will discuss the importance of laying out a solid data foundation for everything digital for any financial institution. The panelists from UFCU and DevFacto will share their journey and agile approach toward data management in a hybrid data environment.
From this session, you will learn how UFCU gained unprecedented agility in data management and built the foundation for a “member 360” view. DevFacto worked with UFCU to design and set up multiple service streams to streamline cloud adoption and seamlessly unify cloud and on-premises data sources. Denodo’s Logical Data Platform enabled UFCU with reusable Lego-like building blocks to create different data views for business teams.
KashTech and Denodo: ROI and Economic Value of Data Virtualization (Denodo)
Watch full webinar here: https://bit.ly/3sumuL5
Join KashTech and Denodo to discover how Data Virtualization can help accelerate your time-to-value from data while reducing the costs at the same time.
Gartner has predicted that organizations using Data Virtualization will spend 40% less on data integration than those using traditional technologies. Denodo customers have experienced time-to-deliver improvements of up to 90% within their data provisioning processes and cost savings of 50% or more. As Rod Tidwell (Cuba Gooding Jr.) said in the movie 'Jerry Maguire', "Show me the money!"
Register to attend and learn how Data Virtualization can:
- Accelerate the delivery of data to users
- Drive digital transformation initiatives
- Reduce project costs and timelines
- Quickly deliver value to your organization
How a Logical Data Fabric Enhances the Customer 360 View (Denodo)
Watch full webinar here: https://bit.ly/3GI802M
Organisations have struggled for years to understand their customers, mainly because they did not have the right data available at the right point in time. In this session we will discuss the role of Data Virtualization in providing a customer 360-degree view and look at some of the success stories our customers have told us about.
Data Virtualization for Compliance – Creating a Controlled Data Environment (Denodo)
CIT modernized its data architecture in response to intense regulatory scrutiny. In this presentation, they present how data virtualization is being used to drive standardization, enable cross-company data integration, and serve as a common provisioning point from which to access all authoritative sources of data.
This presentation is part of the Fast Data Strategy Conference, and you can watch the video here goo.gl/CCqUeT.
Do you lose precious time due to data quality problems?
Do you need to integrate data from multiple sources and provide an integrated view of your customer or product attributes to other systems?
SQL Server 2016 Data Quality Services and Master Data Services can help you.
Education Seminar: Self-service BI, Logical Data Warehouse and Data Lakes (Denodo)
This educational seminar took place on Thursday, December 8th in Westin Galleria Dallas, Texas.
Self-service BI, Logical Data Warehouse and Data Lakes – They are all essential components of Fast Data Strategy. Many companies are rapidly augmenting their traditional data warehouses, data marts, and ETL with their logical counterparts. Reason? Agility and rapid time-to-market.
Speakers include:
• Chuck DeVries, VP, Strategic Technology and Enterprise Architecture, Vizient
• Ravi Shankar, Chief Marketing Officer, Denodo
• Charles Yorek, Vice President, iOLAP
Modernizing Integration with Data Virtualization (Denodo)
Watch full webinar here: https://bit.ly/3CMqS0E
Today, businesses have more data and data types, combined with more complex ecosystems, than they have ever had before. Examples include on-premises data marts, data warehouses, data lakes, applications, spreadsheets, IoT data, sensor data, and unstructured data, combined with cloud data ecosystems like Snowflake, BigQuery, Azure Synapse, Amazon S3, Redshift, and Databricks, and SaaS apps such as Salesforce, Oracle, ServiceNow, and Workday.
Data, Analytics, Data Science and Architecture teams are struggling to provide business users with the right data as quickly and efficiently as possible to enable analytics, dashboards, BI, reports, and more. Unfortunately, many enterprises try to meet this pressing need with antiquated, legacy, 40-year-old approaches. There is a better way, proven by thousands of other companies.
As Forrester so astutely reported in their recent Total Economic Impact Study, companies who employed Data Virtualization reported a “65% decrease in data delivery times over ETL” and an “83% reduction in time to new revenue.”
Join us for this very educational webinar to learn firsthand from Denodo Technologies and Fusion Alliance how:
- Data Virtualization helps your company save time and money by eliminating superfluous ETL pipelines and data replication.
- Data Virtualization can become the cornerstone of your modern data approach to deliver data faster and more efficiently than old legacy approaches at enterprise scale.
- Data Virtualization can scale quickly and easily, even in the most complex environments, to create universal abstraction semantic models for all of your cloud, on-premises, structured, unstructured and hybrid data
- Data Mesh and Data Fabric architecture patterns for maximum reuse
- Other customers have used, and are using, Data Virtualization to tackle their toughest data integration and data delivery challenges
- Fusion Alliance can help you define a data strategy tailored to your organization’s needs and requirements, and how they can help you achieve success and enable your business with self-service capabilities
This is a slide deck that was assembled after months of project work at a global multinational. Collaboration with some incredibly smart people resulted in content I wish I had come across before having to assemble it.
CRM-UG Summit Phoenix 2018 - What is Common Data Model and how to use it? (Nicolas Georgeault)
My slide deck about Common Data Service and Model from CRMUG Summit in Phoenix, Oct 2018. This technology is under development, so the content is subject to change and is based on the current service as of 10/18/2018.
Webinar #2 - Transforming Challenges into Opportunities for Credit Unions (Denodo)
Watch full webinar here: https://buff.ly/3vhzqL5
Join our exclusive webinar series designed to empower credit unions with transformative insights into the untapped potential of data. Explore how data can be a strategic asset, enabling credit unions to overcome challenges and foster substantial growth.
This webinar will delve into how data can serve as a catalyst for addressing key challenges faced by credit unions, propelling them towards a future of enhanced efficiency and growth.
The Oil and Gas industry is evolving rapidly. This is why SBM Offshore has launched an enterprise-wide program to redefine its way of working. In this presentation René Meijers, the Head of Data and Information Management at SBM Offshore, will provide an overview of their entire multidomain MDM program.
Presentation at Master Data Management 2015, Helsinki, FINLAND
VAASAN product master data consolidation
- Master data challenges at VAASAN
- Pre-study and tool selection - why we chose EBX5
- Global solution presentation
- Achieved Benefits and lessons learned
Speaker: Natalia Kopeykin, Service Manager, Business Support systems, VAASAN Group
MDM & RDM: Enabling a One Company Supply Chain in a Decentralized Environment (Orchestra Networks)
Presented @ MDM/DG Summit NYC 2015 (Oct 6, 2015)
In this presentation, Lydia Tilsley (UTC Operations) and Larry Keyser (UTCHQ IT) of the United Technologies Corporation (UTC) describe how reference and master data management is being used to support UTC's "One Supply Chain" initiatives.
In this case study, hear how The United Technologies Corporation, a globally distributed, Fortune 500 company, manages their Oracle Hyperion EPM metadata using Orchestra Networks’ cost-effective, Oracle DRM alternative: EBX5. Also learn how UTC is creating much more value from their Oracle EPM applications by sharing dimensions (such as entities) with their entire finance application ecosystem. The goal: consistency across all financial applications and hierarchies for controls, tax, tax provision and more. The session will also cover practical issues: data exchange with HFM, governance, workflow and hierarchy management.
Sabre is a technology solutions provider to the global travel and tourism industry, encompassing four business units: Sabre Airlines Solutions, Sabre Travel Network, Sabre Hospitality Solutions and Travelocity. Sabre provides software to travel agencies, corporations, travelers, airlines, hotels, rental car, rail, cruise and tour operator companies. Divisions within each of these groups also service the business or corporate travel market. Sabre grew out of American Airlines and was spun off with an IPO in 2000 and currently employs approximately 10,000 people in 60 countries. In addition to managing the business processes and reporting across the four divisions, the IT group has been tasked to provide an agile architecture to accommodate M&A opportunities in the hospitality industry. Clearly, one of the biggest opportunities for leverage of corporate information assets is travel-related “public” and “private” reference data. Critical to the launch of such a program is to answer the key question “Why after all this time do we need RDM?” This session will provide insights and best practices concerning the establishment of an enterprise RDM program in a large global enterprise by discussing topics such as:
– Establishing the business value of an enterprise RDM program (“Hello, Houston … we have a problem”)
– Overcoming the cultural & territorial obstacles by selling change as a compelling argument for RDM (“Shift Happens”)
– Futureproofing the enterprise RDM program solution, outcome & direction (“What we didn’t think about”)
Mastering Oracle® Hyperion EPM Metadata in a Distributed Organization (Orchestra Networks)
In this case study from Kscope14, hear how The United Technologies Corporation, a globally distributed, Fortune 500 company manages their Oracle Hyperion EPM metadata using Orchestra Networks’ cost-effective, DRM alternative: MDM for Oracle Hyperion EPM. The session covered practical issues: data exchange with HFM, governance, workflow. Also discussed, how other alternatives compare and their key differences.
Acolyance (€522M+ revenues) is a leading French agriculture and wine cooperative serving a network of 3,500 agriculture members and 7,000 wine producers. Management is very focused on innovation and is preparing a large-scale transformation for 2014 wherein ERP will be deployed to support most business processes (finance, accounting, harvest, retailing, procurement, sales, …). Unlike the traditional view of ERP systems as the solution for managing all master data within their perimeter, because they were intended to be *the* company core system, Acolyance has decided to make its master data program a prerequisite of its ERP implementation. By having an MDM approach synchronized with the ERP strategy, Acolyance is convinced that the ERP will be able to concentrate on its core business processes and deliver quicker and better. Additionally, MDM enlarges the ERP scope by facilitating collaboration with trading partners. In this session, topics to be discussed include:
- Applying MDM as a key approach to secure ERP implementation projects
- Leveraging MDM to fill in functional weaknesses of ERP systems
- Using MDM to facilitate the update cycle of master data that cannot be updated directly in production systems without ERP customization
Accurate BI & MDM Lead to Successful Project Execution! (Orchestra Networks)
McDermott International’s global CIO describes why MDM is vital for accurate reporting, BI and big data analytics.
Presented at Gartner Enterprise Information and Master Data Management Summit, Las Vegas.
McDermott is an engineering and construction company focused on oil and gas field development projects. McDermott PARS (Project Analytics and Reporting System) supports project delivery with reports that integrate information from across the enterprise. At the heart of PARS: an MDM that manages the relationships–between domains, applications and time–required for accurate reporting and analytics.
Driving Multidomain MDM Simultaneously to ERP Harmonization (Orchestra Networks)
Presentation at the Gartner Master Data Management Summit Europe, Barcelona, February 7th, 2013
Learn how Faurecia, a global leader in the automotive industry, delivered a multi-domain Master Data Management program across its business functions. Discover the benefits of MDM on top of a global SAP instance and multiple corporate systems.
UKOUG 2012 Metadata Management for Oracle Hyperion EPM (Orchestra Networks)
Orchestra Networks presentation at the UK Oracle User Group EPM & Hyperion conference on October 23, 2012. A streamlined approach to the management and governance of shared dimensions and hierarchies - the metadata for the Hyperion EPM Suite.
WWW.MDM.SUMMIT.COM

Credit Suisse Overview
Credit Suisse provides companies, institutional clients and high-net-worth private clients worldwide, as well as retail clients in Switzerland, with advisory services, comprehensive solutions, and excellent products.
• Active in over 50 countries
• 48,000 + Employees
• Pre-tax income: CHF 3.2 billion (2011)
Organized into:
• Private Banking
• Investment Banking
• Asset Management
Reference Data
Any foundational data that provides the basis to generate, structure, categorize, or describe business transactions, and is the basis to view, monitor, analyze and report on these transactions.
Examples
• Client, Counterparty
• Chart of Accounts
• Booking Codes
• Product
• Legal Entity
• Organization
• Currency
• Calendar
Market Data
While Market Data can be considered a sub-type of Reference Data, it is treated separately because of its unique low-latency (real-time) requirements.
Why is Reference Data Important?
Reference Data is a core asset of the bank which should be managed and governed in a systematic fashion. Reference Data impacts most aspects of the bank's operations. When reference data is not used consistently, with commonly understood semantics and sources, it will lead to multiple points of entry/updates, resulting in manual fixes and downstream errors.
Business Imperatives
• Take ownership of data and its quality
• Provide information by adding context to data
• Ensure consistent usage across business processes
• Eliminate manual fixes and workarounds
• Meet regulatory requirements
• Transform data into an information asset

Technology Imperatives
• Reduce the number of point-to-point interfaces
• Increase re-use using managed interfaces
• Reduce complexity by eliminating complex data flows
• Enable the Business to view information instead of data by providing appropriate tools and technology
• Support Operational Independence
• Provide Multi-Entity Capabilities
Current Challenges

Reference Data Challenges
• Inconsistent views of reference data used by different applications lead to incorrect and inconsistent business metrics and reports.
• Multiple sources for a single reference data class (e.g. Counterparty) lead to confusion and inconsistent representations of reference data.
• Poor understanding of reference data sources leads to multiple systems acting as reference data enrichment and distribution points, increasing complexity and decreasing consistency.
• Lack of governance for reference data means no clear ownership and no consistent quality control processes for many reference data classes.
• Complex data flows and poorly understood data dependencies.
Examples
• Different versions of Book codes used within Risk and Finance
• Different Legal Entity hierarchies (out of sync when changes are made)
• Different MIS hierarchies (over 500 versions currently stored)
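Divergence such as the book-code example above is easy to check mechanically once both systems' extracts are available. A minimal reconciliation sketch follows; the system names, codes and desk names are invented for illustration:

```python
# Hypothetical extracts of the "book code" reference data as seen by
# two downstream systems; values are illustrative only.
risk_books = {"B100": "Equities Desk", "B200": "Rates Desk", "B300": "FX Desk"}
finance_books = {"B100": "Equities Desk", "B200": "Rates Trading", "B400": "Credit Desk"}

def reconcile(a: dict, b: dict):
    """Report codes missing on either side and codes whose meanings disagree."""
    only_a = sorted(a.keys() - b.keys())
    only_b = sorted(b.keys() - a.keys())
    mismatched = sorted(k for k in a.keys() & b.keys() if a[k] != b[k])
    return only_a, only_b, mismatched

only_risk, only_fin, diff = reconcile(risk_books, finance_books)
print(only_risk)  # -> ['B300']  codes Finance never received
print(only_fin)   # -> ['B400']  codes Risk never received
print(diff)       # -> ['B200']  same code, different meaning
```

The third category is the most dangerous in practice: both systems report on "B200" and both believe they are right, so the mismatch only surfaces at reconciliation time.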
Reference Data Interfaces – Legacy Interfaces to/from Risk and Finance
• PeopleSoft GL is a large provider of reference data today
• It provides 740 reference data feeds, including:
• GL Accounts
• Consolidation Accounts
• Book
• Org Structures, and others
Vision: Multi-Domain Reference Data Strategy
Vision
To implement a multi-domain reference data management capability that provides consistent, validated, well-formed and well-governed reference data, for all reference data domains (classes) owned and managed by Back Office IT.¹
Business Value
• Providing accurate, consistent reference data will reduce reporting and analysis errors caused by incorrect reference data, and will reduce the overall cost of managing and governing reference data.
IT Architecture Value
• Significant reduction in the number and complexity of reference data interfaces, and simplification of application logic, as all reference data management functions are centralized in a reference data hub.
¹ Excludes Product and Client reference data.
Objectives

Common Data Model (Define our data)
Ensure a common understanding of our data and how it should be used. Introduce a framework to organize our complex data landscape.

Central Platform (Share our data)
Make the right data easily accessible at the right time.

Central Governance (Control our data)
Central data governance ensuring clear ownership and correct usage of the shared data across the divisions.
Vision: Future State
Future State
• High re-usability of data objects
• Use of “true” MDM tools for reference data lifecycle management
• Reduced investment in personalized engineered hardware solutions
• Transparent routing and entitlement
• Consistent semantics
• Consistent data management framework

Business Impact
• Eliminate interpretation risk
• High levels of automation supporting authoring, stewardship, governance
• Consistent user adoption
• Lower cost; lower innovation threshold
• Increased data quality
• Integrated data
• Flexible IT investment
RDH as a Shared Component Across Our Architecture
Reduce complexity and improve efficiency through use of common technology components across organizational domains: a single RDH shared by Risk, Finance, Corporate Services and Data Warehousing.

The RDH:
• Addresses data quality and data standards
• Eliminates “resellers” of reference data
• Offers a single version of the truth
• Centralizes reference data functions for lower cost of ownership
Defining Our Data – Reference Data Terminology & Taxonomy
Subject Areas / Data Domains: Organizational Structure, Entity, Agreement, Ledger, Economic Resource, Party, Product/Service, and Classification Codes.

Reference Data Classes (categorized as Master Data, Structural Data, or Classification Data):
• Organizational Structure: Org Unit, CS Division, MIS Unit, Department, Regions, Organization Entity (OE)
• Entity: Legal Entity, Servicing Entity, Jurisdiction
• Agreement: Client Regulatory Approvals, Standard Settlement Instructions, Legal Contracts, Terms & Conditions
• Ledger: Chart of Accounts, Trading Book Info
• Party: Counterparty, Client, Financial Market, Stock Exchange, External Bodies, Worker, Vendor
• Product/Service and Economic Resource: Premises, Financial Instrument, Product Framework, End of Day Prices, Corporate Actions, Issue Restrictions, Indices, Formulas, Valuations, Currency, Reference Rates
• Classification Codes: Currency Code, Country Code, Calendar, Language Code, Industry Code, Time Zones, Locales, Transaction Types, Instrument Credit Rating, Credit Suisse Rating, Tax Category
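Classification codes such as the currency and country codes above lend themselves to simple automated validation against the governed code sets. A minimal sketch, with invented (truncated) code sets and an invented record shape:

```python
# Illustrative classification-code sets; in practice these would be
# sourced from the reference data hub (e.g. ISO 4217 currency codes
# and ISO 3166 country codes), not hard-coded.
CURRENCY_CODES = {"CHF", "USD", "EUR", "GBP"}
COUNTRY_CODES = {"CH", "US", "DE", "GB"}

def validate(record: dict) -> list:
    """Return the classification errors found in one transaction record."""
    errors = []
    if record.get("currency") not in CURRENCY_CODES:
        errors.append(f"unknown currency: {record.get('currency')}")
    if record.get("country") not in COUNTRY_CODES:
        errors.append(f"unknown country: {record.get('country')}")
    return errors

print(validate({"currency": "CHF", "country": "CH"}))  # -> []
print(validate({"currency": "CHE", "country": "CH"}))  # -> ['unknown currency: CHE']
```

Checks like this only work when every application validates against the same governed code sets, which is exactly the consistency the hub is meant to provide.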
Defining Our Data – Common Data Model
The Common Data Model comprises four layers:
• Business Glossary – describes the definition, usage, ownership and data governance aspects of reference data class data elements in the target design.
• Business Object Models – describe relationships and dependencies.
• Logical Data Models – drive the development of the Service Data Models.
• Service Data Models – used for distributing data as a SOA service to consumers.
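To make the last layer concrete, a service data model for a single reference data class might look like the following sketch. The record fields and payload shape are assumptions for illustration, not the actual Credit Suisse models:

```python
from dataclasses import dataclass, asdict
import json

@dataclass(frozen=True)
class CurrencyRecord:
    """Hypothetical service data model for one reference data class."""
    code: str         # primary identifier, e.g. an ISO 4217 code
    name: str
    minor_units: int  # decimal places, as defined in the glossary
    owner: str        # governance: the accountable data owner

def to_service_payload(records) -> str:
    """Serialize records as the JSON a SOA consumer would receive."""
    return json.dumps([asdict(r) for r in records])

payload = to_service_payload(
    [CurrencyRecord("CHF", "Swiss franc", 2, "Group Finance")]
)
print(payload)
```

Note how the glossary (definitions, ownership) and the logical model (fields, types) both leave traces in the service model: the distribution payload is the end product of the layers above it.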
Control our Data - Governance for Reference Data Management

Our Approach
• Minimum Governance Model defined
• Sourcing
• Definition
• Management
• Distribution
• Data Quality
• If the minimum governance is met, the source is approved as a managed interface to the Golden Source
Opportunistic
• Use every opportunity to push data governance
• A couple of serious data quality issues were escalated to the ExB; this was used to set up an STC comprising the CFO, CIO and GC, and a Governance Board of all COOs in the Back Office
• A regulatory push to handle contract data as reference data was used to include the IB in the Data Governance Board
Focus on Value-Add
• Avoided the pitfall of trying to define organizations and roles (viewed as too academic)
• As long as the Minimum Governance Model was implemented, it was good enough, thereby avoiding lengthy discussions of who should be called what (Data Steward, Data Tsar, Data Provider, Data Owner, Data Governance, Data Conference, etc.)
Share our Data - Target Technology
Orchestra Networks
EBX from Orchestra Networks was selected as the standard tool for managing Structural and Classification reference data.
• Selected after a Gartner vendor short list and an RFP process completed in Dec. 2011
• Approved by the Architecture STC for Structural and Classification data
• Offers a configuration-based tool with little to no coding required
• Provides robust support for data governance, with workflow that can be adapted to our business operating model
• Also selected by Asset Management as their client and product MDM tool
Operational Pilot
• Operational pilot completed in April 2012
• Gained a detailed understanding of the production footprint, configuration requirements, time-to-market considerations, and integration with other CS tools and platforms
Broader Opportunity
• An opportunity exists to leverage this technology investment to support Master Data management, addressing the challenges of PB and IB (e.g. managing derivative contract content in the IB contract life cycle management initiative)
• The IB Client Data Management program is evaluating Orchestra Networks and assessing its suitability for their requirements
Share our Data - Target Technology
The analysis is based on Product Risk and Vendor Risk. Product Risk is based on the market success of the product and the maturity of the market; Vendor Risk is based on the reputation and stability of the vendor.
High Risk • No market penetration
• Beta version
• E.g., Oracle Fusion Products
Product Risk
Low Risk • Stable product with very high market
penetration
• Mature market
• E.g., Oracle Database
Medium
Risk
• Stable product with medium market
penetration
• Growth mode
• E.g., Oracle Universal Content Management
High Risk • In conception stage. No Enterprise customers
• Not profitable. No cash flow
• Unknown in the market place
Vendor Risk
Low Risk • Stable company with high revenues and stable balance
sheet
• Well recognized in the market place
Medium
Risk
• Has multiple enterprise customers using the Vendor
• Is profitable with a positive cash flow/Risk of being
acquired
• Recognized by analysts/markets as viable alternative
Product Risk Profile is Medium
• Orchestra Networks' EBX product was short-listed #1 by Gartner
• Mitigation: vendor relationship with the competency center to help evolve the product
and its future direction; ensure a single code base is maintained across customers
Vendor Risk Profile is Medium
• Used at BNP Paribas and various other banks/industries
• Mitigation: provide references to other clients (already done with Citibank and ANZ)
to increase market share; provide visibility to the vendor with speaking engagements
at conferences (currently being done)
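The product-risk rubric above can be read as a small decision function. The sketch below is purely illustrative: the parameter names and threshold values are assumptions, not part of the original assessment.

```python
# Hypothetical encoding of the slide's product-risk rubric: risk falls as
# market penetration and market maturity rise. Thresholds are illustrative.

def product_risk(market_penetration: str, market_stage: str) -> str:
    """market_penetration: 'none' | 'medium' | 'high';
    market_stage: 'beta' | 'growth' | 'mature'."""
    if market_penetration == "none" or market_stage == "beta":
        return "high"       # e.g. Oracle Fusion Products
    if market_penetration == "high" and market_stage == "mature":
        return "low"        # e.g. Oracle Database
    return "medium"         # e.g. Oracle Universal Content Management

print(product_risk("medium", "growth"))  # medium, matching EBX's profile
```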
13. Reference Data Onboarding Strategy (1 of 2)
[Diagram: two Ref Data Hub patterns (IB & PB Ref Data Prgm and BO RDH Prgm). Each hub covers Authoring, Management, Governance and Distribution and feeds Consuming Apps; Data Stewards and a Governance Body operate on the hub. One pattern adds a Match/Merge step with an optional authoring feedback loop to the sources.]
1. Multiple Reference Data Sources (e.g. Client, Product)
• Multiple sources for the same reference data class require (potentially sophisticated)
Matching (de-duplication) and Merging (attribute survivorship) capability
• Authoring (creation of new instances) remains with the sources
• Management and governance take place in the hub, with an optional feedback loop to
the sources of record
• All consuming apps acquire from the Ref Data Hub
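As a rough illustration of the matching and survivorship capability described for this scenario, here is a minimal sketch. The record layout, the `lei` match key, and the source-priority survivorship rule are all assumptions for the example, not details of the CS implementation.

```python
# Minimal sketch of the match (de-duplication) and merge (attribute
# survivorship) step from scenario 1. All names are hypothetical.

# Survivorship rule: per attribute, take the value from the
# highest-priority source that supplies one (lower number = preferred).
SOURCE_PRIORITY = {"CRM": 0, "TRADING": 1, "LEGACY": 2}

def match(records, key="lei"):
    """Group candidate records that share the same match key."""
    groups = {}
    for rec in records:
        groups.setdefault(rec[key], []).append(rec)
    return groups

def merge(group):
    """Merge duplicates into one golden record via source priority."""
    golden = {}
    for rec in sorted(group, key=lambda r: SOURCE_PRIORITY[r["source"]]):
        for attr, value in rec.items():
            if attr != "source" and attr not in golden and value is not None:
                golden[attr] = value
    return golden

records = [
    {"source": "LEGACY", "lei": "529900T8BM49AURSDO55",
     "name": "ACME Ltd", "country": "GB"},
    {"source": "CRM", "lei": "529900T8BM49AURSDO55",
     "name": "ACME Limited", "country": None},
]
golden = [merge(g) for g in match(records).values()]
print(golden)  # name survives from CRM, country falls back to LEGACY
```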
2. Authoring External to RDH (e.g. Currency, Industry Codes)
• Ref Data Hub acts as golden source; the source of record is external to the RDH
(can be external to CS)
• All authoring and management (e.g. hierarchy maintenance) performed by data stewards
in the source of record
• Ref data is loaded into the Ref Data Hub on a periodic basis
• Governance activities take place in the Ref Data Hub
• All consuming apps acquire from the Ref Data Hub
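A minimal sketch of the periodic load in this scenario, assuming a hypothetical in-memory hub and a currency-code snapshot. Codes dropped by the external source are flagged for steward review (a governance activity in the hub) rather than silently deleted; the data shapes and rule are illustrative assumptions.

```python
# Sketch of scenario 2: reference data (e.g. currency codes) is authored
# externally and loaded into the hub on a schedule. Structures are
# hypothetical stand-ins, not a real hub API.

def periodic_load(hub, snapshot):
    """Upsert the external snapshot; codes that disappear from the
    source are flagged for steward review instead of being deleted."""
    report = {"added": [], "updated": [], "flagged": []}
    for code, attrs in snapshot.items():
        if code not in hub:
            hub[code] = dict(attrs, status="active")
            report["added"].append(code)
        elif {k: hub[code].get(k) for k in attrs} != attrs:
            hub[code].update(attrs)
            report["updated"].append(code)
    for code in hub:
        if code not in snapshot and hub[code]["status"] == "active":
            hub[code]["status"] = "review"   # governance step in the hub
            report["flagged"].append(code)
    return report

hub = {"CHF": {"name": "Swiss franc", "status": "active"},
       "DEM": {"name": "Deutsche Mark", "status": "active"}}
snapshot = {"CHF": {"name": "Swiss franc"}, "EUR": {"name": "Euro"}}
report = periodic_load(hub, snapshot)
print(report)  # {'added': ['EUR'], 'updated': [], 'flagged': ['DEM']}
```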
14. Reference Data Onboarding Strategy (2 of 2)
[Diagram: Ref Data Hubs covering Authoring, Management, Governance and Distribution, with Data Stewards and a Governance Body, an optional one-time initial load, and Consuming Apps acquiring from the hub.]
3. Simple Authoring in RDH (e.g. GL COA, Calendar)
• Ref Data Hub acts as source of record and golden source
• Optional initial data load from an external source
• All authoring and management (e.g. hierarchy maintenance) performed by data stewards
in the Ref Data Hub
• Governance activities take place in the Ref Data Hub
• All consuming apps acquire from the Ref Data Hub
4. Complex Authoring in RDH (e.g. Book)
• Complex management processes (e.g. complex workflows) require a two-step onboarding
process
• Initially, the existing source of record is used, and ref data is loaded into the hub
for governance and distribution
• Later, when sophisticated management processes have been implemented in the Ref Data
Hub, it becomes the source of record, eliminating the dependency on the external source
• All consuming apps acquire from the Ref Data Hub
15. Reference Data Adoption Strategy
Current State
• The existing Golden Source systems have a large number of point-to-point interfaces
• The majority of consumers are sourcing data from a non-golden-source system, which
leads to reduced control over the quality and timeliness of the delivered reference data
2012-2013 Focus
• Our adoption strategy will first focus on significantly reducing existing
point-to-point interfaces and maintenance costs by migrating inter-domain consumers
directly attached to the Golden Sources
• As a second step, we are planning to connect existing Data Hubs to the RDH. This will
immediately provide high-quality and timely data to a large number of consumers
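The interface-reduction argument is back-of-the-envelope arithmetic: a full point-to-point mesh grows with sources × consumers, while a hub grows with sources + consumers. In the sketch below, the source count is an assumed figure; only the 84 consuming systems appear in the deck.

```python
# Why a hub cuts interface count: point-to-point wiring is worst-case a
# full mesh, while a hub needs one feed per source plus one per consumer.
# The source count (10) is illustrative, not Credit Suisse's actual figure.

def point_to_point(sources, consumers):
    return sources * consumers      # worst case: every consumer hits every source

def via_hub(sources, consumers):
    return sources + consumers      # each party connects to the hub once

sources, consumers = 10, 84         # 84 consuming systems per the 2012 goals slide
print(point_to_point(sources, consumers))  # 840 interfaces to maintain
print(via_hub(sources, consumers))         # 94 interfaces
```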
16. Reference Data Hub – Goals for 2012
Corporate Structural Data (Worker, Facilities, Organization)
• Reference data available in RDH
• 2012 focus is on adoption
• 84 consuming systems identified for initial migration
Strategic Risk Program (Book)
• Reference data available in RDH
• 2012 focus is on adoption
Contract Lifecycle Management (Contract Data)
• Focus is onboarding and adoption
PB Platform Renewal and MEC (Language, Calendar, Regions, Division)
• Focus is onboarding and adoption
OnePPM (Project Portfolio, Product Portfolio)
• Focus is onboarding and adoption
OneGL (GL Chart of Accounts)
• Focus is onboarding and adoption
Further data classes: Locale/Country, State, Currency, Servicing Entity
2012 Goals
• A true horizontal service to provide/consume reference data across BO
IT, eliminating the need for disparate reference data hubs
• Standardized process for deploying Reference Data
• Align with major initiatives/functions to supply required reference data
17. Lessons Learned
Governance
Challenges
The challenges of implementing Data Governance
• Top Down
  • Getting a dedicated data governance organization has been challenging
  • No pushback on the idea, but hard to decide who takes responsibility, how to fund
  the central group, and how to make the business case
• Bottom Up
  • Standard answer: “Everything is working fine”
  • Hard to get visibility into the manual workarounds and fixes being done relating to
  data quality issues
• The cynical response is that data governance is hard, and selecting a preferred
approach or standard often boils down to making a pragmatic decision between
sub-optimal options
• The lack of data governance “maturity” is complicated by the demand for “one bank data”
– clear data visibility and accountability between front office and back office
Application
Engineering
Challenges
Defining a clear roadmap for application design change
• Assessing the degree of, and appetite for, change: migrating reference data from a
function of individual applications to a common component leveraged across our suite of
applications
• Developing “data adapters” to bridge strategic service data models to legacy
point-to-point interfaces, managing the risk associated with change
• Establishing the right metrics to measure progress and to drive the business case for
change
Summary
Never let a crisis go to waste
• Regulation is the new factor here – this is a genuine opportunity to change the way
reference data is sourced, managed and distributed