The document provides guidance on gaining executive support for master data management (MDM) projects. It recommends quantifying the hidden costs of bad data, conducting interviews with stakeholders across business units to understand data issues, and analyzing the findings to build a business case that shows the specific financial benefits of implementing MDM. Key steps include identifying stakeholders in IT and business functions, preparing interview questions tailored to different roles, interviewing a wide range of staff, and using the results to quantify savings and improved revenues from reducing data problems.
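The quantification step described above can be sketched as a simple cost model. All figures, rates, and category names below are hypothetical placeholders a team would replace with numbers gathered from its stakeholder interviews.

```python
# Illustrative sketch: quantifying the hidden cost of bad data to support an
# MDM business case. Every figure here is a hypothetical stakeholder estimate.

DUPLICATE_RATE = 0.08          # share of customer records that are duplicates
RECORDS = 500_000              # total customer master records
COST_PER_DUPLICATE = 4.50      # annual handling cost per duplicate (returned mail, rework)
REVENUE_AT_RISK = 250_000      # annual revenue lost to misdirected offers
MDM_REMEDIATION_SHARE = 0.75   # fraction of the problem MDM is expected to eliminate

def annual_bad_data_cost(records, dup_rate, cost_per_dup, revenue_at_risk):
    """Direct handling cost of duplicates plus revenue put at risk."""
    duplicates = records * dup_rate
    return duplicates * cost_per_dup + revenue_at_risk

def projected_mdm_savings(records, dup_rate, cost_per_dup, revenue_at_risk, share):
    """Savings claimed in the business case: the share of cost MDM removes."""
    return annual_bad_data_cost(records, dup_rate, cost_per_dup, revenue_at_risk) * share

cost = annual_bad_data_cost(RECORDS, DUPLICATE_RATE, COST_PER_DUPLICATE, REVENUE_AT_RISK)
savings = projected_mdm_savings(RECORDS, DUPLICATE_RATE, COST_PER_DUPLICATE,
                                REVENUE_AT_RISK, MDM_REMEDIATION_SHARE)
print(f"Annual cost of bad data: ${cost:,.0f}")
print(f"Projected MDM savings:  ${savings:,.0f}")
```

Even a rough model like this makes the business case concrete: executives can challenge individual assumptions rather than the overall claim.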
The document discusses different techniques for building a Customer Data Hub (CDH), including registry, co-existence, and transactional techniques. It outlines the CDH build methodology, including data analysis, defining the data model and business logic, participation models, governance, and deliverables. An example enterprise customer data model is also shown using a hybrid-party model with relationships, hierarchies, and extended attributes.
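A hybrid-party model of the kind mentioned above can be sketched in a few types. The class and field names here are illustrative assumptions, not the presentation's actual model: the key idea is that people and organizations share one "party" abstraction, with relationships, hierarchies, and extensible attributes layered on top.

```python
# Minimal sketch of a hybrid-party customer data model: parties (person or
# organization) with hierarchies, relationships, and extended attributes.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Party:
    party_id: str
    party_type: str                      # "PERSON" or "ORGANIZATION"
    name: str
    parent_id: Optional[str] = None      # hierarchy: e.g. subsidiary -> parent org
    extended: dict = field(default_factory=dict)  # extensible attributes

@dataclass
class PartyRelationship:
    from_id: str
    to_id: str
    rel_type: str                        # e.g. "EMPLOYEE_OF", "SUBSIDIARY_OF"

acme = Party("P1", "ORGANIZATION", "Acme Corp")
acme_uk = Party("P2", "ORGANIZATION", "Acme UK Ltd", parent_id="P1")
jane = Party("P3", "PERSON", "Jane Doe", extended={"loyalty_tier": "gold"})
rels = [PartyRelationship("P3", "P2", "EMPLOYEE_OF"),
        PartyRelationship("P2", "P1", "SUBSIDIARY_OF")]
```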
Presentation of use cases of Master Data Management for customer data. It presents the business drivers and how the Talend platform for MDM can address them.
Gartner: Master Data Management Functionality (Gartner)
MDM solutions require tightly integrated capabilities including data modeling, integration, synchronization, propagation, flexible architecture, granular and packaged services, performance, availability, analysis, information quality management, and security. These capabilities allow organizations to extend data models, integrate and synchronize data in real-time and batch processes across systems, measure ROI and data quality, and securely manage the MDM solution.
Data Modelling 101: a half-day workshop presented by Chris Bradley at the Enterprise Data and Business Intelligence conference in London on 3 November 2014.
Chris Bradley is a leading independent information strategist.
Contact chris.bradley@dmadvisors.co.uk
Driving Data Intelligence in the Supply Chain Through the Data Catalog at TJX (DATAVERSITY)
Roles and responsibilities are a critical component of every Data Governance program. Building a set of roles that are practical and that will not interfere with people’s “day jobs” is an important consideration that will influence how well your program is adopted. This tutorial focuses on sharing a proven model guaranteed to represent your organization.
Join Bob Seiner for this lively webinar where he will dissect a complete Operating Model of Roles and Responsibilities that encompasses all levels of the organization. Seiner will detail the roles and describe the most effective way to associate people with the roles. You will walk out of this webinar with a model to apply to your organization.
In this session Bob will share:
- The five levels of Data Governance roles
- A proven Operating Model of Roles and Responsibilities
- How to customize the model to meet your requirements
- Setting appropriate role expectations
- How to operationalize the roles and demonstrate value
3 Keys To Successful Master Data Management - Final Presentation (James Chi)
This document discusses keys to successful master data management including process, governance, and architecture. It summarizes a survey finding that while many companies see data as an asset, only around 20% have implemented master data management. Successful MDM requires alignment with business objectives, clear governance models, and comprehensive solution architectures. The document advocates establishing policies, procedures, standards, governance, and tools to create and maintain high-quality shared reference data.
Enabling a Data Mesh Architecture and Data Sharing Culture with Denodo (Denodo)
Watch full webinar here: https://bit.ly/3wATS7b
Data Mesh presents a new distributed and decentralized paradigm for data management, where autonomous domains expose their own data as "data products" to the rest of the organization. It tries to reduce bottlenecks derived from an excessive dependence on centralized IT teams, and capitalizes on the specialized data knowledge that domain users already have. However, Data Mesh literature leaves the implementation of these ideas very open to each organization.
Watch on-demand and learn:
- Deep dive into the key ideas of Data Mesh
- Understand how Denodo can help you implement a Data Mesh
- Hear directly from a Denodo client, Landsbankinn, their journey from a traditional analytic architecture to Data Mesh using Denodo
Business Intelligence & Data Analytics – An Architected Approach (DATAVERSITY)
Business intelligence (BI) and data analytics are increasing in popularity as more organizations are looking to become more data-driven. Many tools have powerful visualization techniques that can create dynamic displays of critical information. To ensure that the data displayed on these visualizations is accurate and timely, a strong Data Architecture is needed. Join this webinar to understand how to create a robust Data Architecture for BI and data analytics that takes both business and technology needs into consideration.
Essential Reference and Master Data Management (DATAVERSITY)
Data tends to pile up and can be rendered unusable or obsolete without careful maintenance processes. Reference and Master Data Management (MDM) has been a popular Data Management approach to effectively gain mastery over not just the data but the supporting architecture for processing it. This webinar presents MDM as a strategic approach to improving and formalizing practices around those data items that provide context for many organizational transactions: its master data. Too often, MDM has been implemented technology-first and achieved the same very poor track record (one-third succeeding on-time, within budget, and achieving planned functionality). MDM success depends on a coordinated approach typically involving Data Governance and Data Quality activities.
Learning objectives:
- Understand foundational reference and MDM concepts based on the Data Management Body of Knowledge (DMBOK)
- Understand why these are an important component of your Data Architecture
- Gain awareness of Reference and MDM Frameworks and building blocks
- Know what MDM guiding principles consist of and best practices
- Know how to utilize reference and MDM in support of business strategy
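Two of the MDM building blocks such frameworks cover are record matching and survivorship. The sketch below shows both in their simplest form; the match rule, trust order, and field names are illustrative assumptions, not prescriptions from the webinar.

```python
# Minimal sketch of two core MDM operations: matching candidate records and
# applying field-level survivorship rules to assemble a golden record.
def normalize(s):
    return " ".join(s.lower().split())

def is_match(a, b):
    """Naive deterministic match: same normalized name and postcode."""
    return (normalize(a["name"]) == normalize(b["name"])
            and a["postcode"] == b["postcode"])

def survive(records, trust_order):
    """Take each field's value from the most trusted source that has one."""
    golden = {}
    for source in trust_order:
        for rec in records:
            if rec["source"] == source:
                for k, v in rec.items():
                    if k != "source" and k not in golden and v:
                        golden[k] = v
    return golden

crm = {"source": "CRM", "name": "ACME Corp ", "postcode": "SW1A 1AA", "phone": ""}
erp = {"source": "ERP", "name": "acme corp", "postcode": "SW1A 1AA",
       "phone": "+44 20 7946 0000"}
golden = survive([crm, erp], trust_order=["CRM", "ERP"])
```

Real implementations use probabilistic matching and per-attribute trust rules, but the shape of the problem is the same: decide which records are the same entity, then decide which values win.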
Strategic Business Requirements for Master Data Management Systems (Boris Otto)
This presentation describes strategic business requirements of master data management (MDM) systems. The requirements were developed in a consortium research approach by the Institute of Information Management at the University of St. Gallen, Switzerland, and 20 multinational enterprises.
The presentation was given at the 17th Americas Conference on Information Systems (AMCIS 2011) in Detroit, MI.
The research paper on which this presentation is based can be found here: http://www.alexandria.unisg.ch/Publikationen/Zitation/Boris_Otto/177697
Metadata is hotter than ever, according to a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
The document discusses strategies for managing master data through a Master Data Management (MDM) solution. It outlines challenges with current data management practices and goals for an improved MDM approach. Key considerations for implementing an effective MDM strategy include identifying initial data domains, use cases, source systems, consumers, and the appropriate MDM patterns to address business needs.
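The scoping considerations listed above can be captured as plain data so they can be reviewed and versioned alongside the strategy. All entries below are hypothetical examples.

```python
# A simple MDM scoping worksheet: initial domains, use cases, sources,
# consumers, and an implementation pattern per domain. Entries are illustrative.
mdm_scope = {
    "initial_domains": ["customer", "product"],
    "use_cases": ["single customer view", "regulatory reporting"],
    "source_systems": ["CRM", "ERP", "e-commerce"],
    "consumers": ["marketing analytics", "billing", "service desk"],
    # Commonly cited implementation styles include registry, consolidation,
    # coexistence, and centralized; one is chosen per domain based on need.
    "pattern_by_domain": {"customer": "coexistence", "product": "registry"},
}

def pattern_for(domain):
    """Look up the chosen MDM pattern for a domain, if one has been decided."""
    return mdm_scope["pattern_by_domain"].get(domain, "undecided")
```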
Describes what Enterprise Data Architecture in a Software Development Organization should cover and does that by listing over 200 data architecture related deliverables an Enterprise Data Architect should remember to evangelize.
This practical presentation will cover the most important and impactful artifacts and deliverables needed to implement and sustain governance. Rather than speak hypothetically about what output is needed from governance, it covers and reviews artifact templates to help you re-create them in your organization.
Topics covered:
- Which artifacts are most important to get started
- Important artifacts for more mature programs
- How to ensure the artifacts are used and implemented, not just written
- How to integrate governance artifacts into operational processes
- Who should be involved in creating the deliverables
Data Architecture Strategies: Data Architecture for Digital Transformation (DATAVERSITY)
Digital transformation relies on foundational data management approaches such as MDM, data quality, and data architecture. At the same time, combining these foundational approaches with other innovative techniques can help drive organizational change as well as technological transformation. This webinar will provide practical steps for creating a data foundation for effective digital transformation.
The document provides an overview of the Databricks platform, which offers a unified environment for data engineering, analytics, and AI. It describes how Databricks addresses the complexity of managing data across siloed systems by providing a single "data lakehouse" platform where all data and analytics workloads can be run. Key features highlighted include Delta Lake for ACID transactions on data lakes, auto loader for streaming data ingestion, notebooks for interactive coding, and governance tools to securely share and catalog data and models.
To take a “ready, aim, fire” tactic to implement Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming and can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Building a Data Strategy – Practical Steps for Aligning with Business Goals (DATAVERSITY)
Developing a Data Strategy for your organization can seem like a daunting task – but it’s worth the effort. Getting your Data Strategy right can provide significant value, as data drives many of the key initiatives in today’s marketplace – from digital transformation, to marketing, to customer centricity, to population health, and more. This webinar will help demystify Data Strategy and its relationship to Data Architecture and will provide concrete, practical ways to get started.
The data architecture of solutions is frequently not given the attention it deserves or needs. Too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components, a consequence of the behaviours of both solution architects and data architects.
Solution architecture tends to concern itself with the functional, technology and software components of the solution.
Data architecture tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Solution architecture, in turn, frequently omits the detail of the data aspects of solutions, leaving a solution data architecture gap. Together, these gaps result in a data blind spot for the organisation.
Data architecture tends to concern itself with what happens after individual solutions are delivered. It needs to shift left into the domain of solutions and their data, engaging more actively with the data dimensions of individual solutions. Data architecture can take the lead in closing these gaps through a shift-left of its scope and activities, as well as by providing standards and common data tooling for solution data architecture.
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Long data update and response times caused by poorly designed data structures, affecting solution usability, productivity and transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident and manifest themselves only after the solution goes live. The benefits of solution data architecture are not always evident initially.
How to Build & Sustain a Data Governance Operating Model (DATUM LLC)
Learn how to execute a data governance strategy through creation of a successful business case and operating model.
Originally presented to an audience of 400+ at the Master Data Management & Data Governance Summit.
Visit www.datumstrategy.com for more!
Data Architecture Strategies: Building an Enterprise Data Strategy – Where to... (DATAVERSITY)
The majority of successful organizations in today’s economy are data-driven, and innovative companies are looking at new ways to leverage data and information for strategic advantage. While the opportunities are vast, and the value has clearly been shown across a number of industries in using data to strategic advantage, the choices in technology can be overwhelming. From Big Data to Artificial Intelligence to Data Lakes and Warehouses, the industry is continually evolving to provide new and exciting technological solutions.
This webinar will help make sense of the various data architectures and technologies available, and how to leverage them for business value and success. A practical framework will be provided to generate "quick wins" for your organization, while at the same time building towards a longer-term sustainable architecture. Case studies will also be provided to show how successful organizations have built data strategies to support their business goals.
LDM Webinar: Data Modeling & Business Intelligence (DATAVERSITY)
Business Intelligence (BI) is a valuable way to use information to show the overall health and performance of the organization. At its core is quality, well-structured data that allows for successful reporting and analytics. A data model helps provide both the business definitions as well as the structural optimization needed for successful BI implementations.
Join this webinar to see how a data model underpins business intelligence and analytics in today’s organization.
Data Lake Architecture – Modern Strategies & Approaches (DATAVERSITY)
Data Lake or Data Swamp? By now, we’ve likely all heard the comparison. Data Lake architectures have the opportunity to provide the ability to integrate vast amounts of disparate data across the organization for strategic business analytic value. But without a proper architecture and metadata management strategy in place, a Data Lake can quickly devolve into a swamp of information that is difficult to understand. This webinar will offer practical strategies to architect and manage your Data Lake in a way that optimizes its success.
Enterprise Architecture vs. Data Architecture (DATAVERSITY)
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how Data Architecture is a key component of an overall Enterprise Architecture for enhanced business value and success.
This presentation reports on data governance best practices. Based on a definition of fundamental terms and the business rationale for data governance, a set of case studies from leading companies is presented. The content of this presentation is a result of the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen, Switzerland.
Data Architecture - The Foundation for Enterprise Architecture and Governance (DATAVERSITY)
Organizations are faced with an increasingly complex data landscape, finding themselves unable to cope with exponentially increasing data volumes, compounded by additional regulatory requirements with increased fines for non-compliance. Enterprise architecture and data governance are frequently discussed at length, but often with different stakeholder audiences. This can result in complementary and sometimes conflicting initiatives rather than a focused, integrated approach. Data governance requires a solid data architecture foundation in order to support the pillars of enterprise architecture. In this session, IDERA's Ron Huizenga will discuss a practical, integrated approach to effectively understand, define and implement a cohesive enterprise architecture and data governance discipline with integrated modeling and metadata management.
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
Business Value Through Reference and Master Data Strategies (DATAVERSITY)
Data tends to pile up and can be rendered unusable or obsolete without careful maintenance processes. Reference and Master Data Management (MDM) has been a popular Data Management approach to effectively gain mastery over not just the data but the supporting architecture for processing it. This webinar presents MDM as a strategic approach to improving and formalizing practices around those data items that provide context for many organizational transactions — the master data. Too often, MDM has been implemented technology-first and achieved the same very poor track record (one-third succeeding on time, within budget, and achieving planned functionality). MDM success depends on a coordinated approach, typically involving Data Governance and Data Quality activities.
Learning Objectives:
• Understand foundational reference and MDM concepts based on the Data Management Body of Knowledge (DMBoK)
• Understand why these are an important component of your Data Architecture
• Gain awareness of reference and MDM frameworks and building blocks
• Know what MDM guiding principles consist of and best practices
• Know how to utilize reference and MDM in support of business strategy
The article is intended as a quick overview of what effective master data management means in today's business context in terms of risks, challenges and opportunities for companies and decision makers. The article is structured in two main areas, which cover in turn the importance of an effective master data management implementation and the methodology to get there.
The document discusses strategies for implementing master data management (MDM) across an enterprise. It warns against technology-focused strategies that advocate starting small with a single data type or deployment style. Such limited initial approaches cannot solve important business problems and limit the potential return on investment from MDM. Instead, the document recommends a business-focused strategy that addresses key business needs, which can then enable successful expansion of MDM across the organization. Examples are provided of companies that leveraged MDM to solve pressing divisional problems and then expanded its use more broadly.
Essential Reference and Master Data ManagementDATAVERSITY
Data tends to pile up and can be rendered unusable or obsolete without careful maintenance processes. Reference and Master Data Management (MDM) has been a popular Data Management approach to effectively gain mastery over not just the data but the supporting architecture for processing it. This webinar presents MDM as a strategic approach to improving and formalizing practices around those data items that provide context for many organizational transactions: its master data. Too often, MDM has been implemented technology-first and achieved the same very poor track record (one-third succeeding on-time, within budget, and achieving planned functionality). MDM success depends on a coordinated approach typically involving Data Governance and Data Quality activities.
Learning objectives:
- Understand foundational reference and MDM concepts based on the Data Management Body of Knowledge (DMBOK)
- Understand why these are an important component of your Data Architecture
- Gain awareness of Reference and MDM Frameworks and building blocks
- Know what MDM guiding principles consist of and best practices
- Know how to utilize reference and MDM in support of business strategy
Strategic Business Requirements for Master Data Management SystemsBoris Otto
This presentation describes strategic business requirements of master data management (MDM) systems. The requirements were developed in a consortium research approach by the Institute of Information Management at the University of St. Gallen, Switzerland, and 20 multinational enterprises.
The presentation was given at the 17th Amercias Conference on Information Systems (AMCIS 2011) in Detroit, MI.
The research paper on which this presentation is based on can be found here: http://www.alexandria.unisg.ch/Publikationen/Zitation/Boris_Otto/177697
Metadata is hotter than ever, according a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
The document discusses strategies for managing master data through a Master Data Management (MDM) solution. It outlines challenges with current data management practices and goals for an improved MDM approach. Key considerations for implementing an effective MDM strategy include identifying initial data domains, use cases, source systems, consumers, and the appropriate MDM patterns to address business needs.
Describes what Enterprise Data Architecture in a Software Development Organization should cover and does that by listing over 200 data architecture related deliverables an Enterprise Data Architect should remember to evangelize.
This practical presentation will cover the most important and impactful artifacts and deliverables needed to implement and sustain governance. Rather than speak hypothetically about what output is needed from governance, it covers and reviews artifact templates to help you re-create them in your organization.
Topics covered:
- Which artifacts are most important to get started
- Important artifacts for more mature programs
- How to ensure the artifacts are used and implemented, not just written
- How to integrate governance artifacts into operational processes
- Who should be involved in creating the deliverables
Data Architecture Strategies: Data Architecture for Digital TransformationDATAVERSITY
MDM, data quality, data architecture, and more. At the same time, combining these foundational data management approaches with other innovative techniques can help drive organizational change as well as technological transformation. This webinar will provide practical steps for creating a data foundation for effective digital transformation.
The document provides an overview of the Databricks platform, which offers a unified environment for data engineering, analytics, and AI. It describes how Databricks addresses the complexity of managing data across siloed systems by providing a single "data lakehouse" platform where all data and analytics workloads can be run. Key features highlighted include Delta Lake for ACID transactions on data lakes, auto loader for streaming data ingestion, notebooks for interactive coding, and governance tools to securely share and catalog data and models.
To take a “ready, aim, fire” tactic to implement Data Governance, many organizations assess themselves against industry best practices. The process is not difficult or time-consuming and can directly assure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine if a practice is best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
Building a Data Strategy – Practical Steps for Aligning with Business GoalsDATAVERSITY
Developing a Data Strategy for your organization can seem like a daunting task – but it’s worth the effort. Getting your Data Strategy right can provide significant value, as data drives many of the key initiatives in today’s marketplace – from digital transformation, to marketing, to customer centricity, to population health, and more. This webinar will help demystify Data Strategy and its relationship to Data Architecture and will provide concrete, practical ways to get started.
The data architecture of solutions is frequently not given the attention it deserves or needs. Too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components, due to the behaviours of both solution architects and data architects.
Solution architecture tends to concern itself with the functional, technology and software components of the solution. Data architecture tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Solution architecture, in turn, frequently omits the detail of the data aspects of solutions, leaving a solution data architecture gap. Together, these gaps create a data blind spot for the organisation.
Data architecture tends to concern itself with data only after individual solutions are built. It needs to shift left into the domain of solutions and their data, engaging more actively with the data dimensions of individual solutions. Data architecture can lead the way in closing these gaps by shifting its scope and activities left, as well as by providing standards and common data tooling for solution data architecture.
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Long data update and response times caused by poorly designed data structures, hurting solution usability and productivity and leading to transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident and manifest themselves only after the solution goes live. The benefits of solution data architecture are not always evident initially.
How to Build & Sustain a Data Governance Operating Model (DATUM LLC)
Learn how to execute a data governance strategy through creation of a successful business case and operating model.
Originally presented to an audience of 400+ at the Master Data Management & Data Governance Summit.
Visit www.datumstrategy.com for more!
Data Architecture Strategies: Building an Enterprise Data Strategy – Where to... (DATAVERSITY)
The majority of successful organizations in today’s economy are data-driven, and innovative companies are looking at new ways to leverage data and information for strategic advantage. While the opportunities are vast, and the value has clearly been shown across a number of industries in using data to strategic advantage, the choices in technology can be overwhelming. From Big Data to Artificial Intelligence to Data Lakes and Warehouses, the industry is continually evolving to provide new and exciting technological solutions.
This webinar will help make sense of the various data architectures and technologies available, and how to leverage them for business value and success. A practical framework will be provided to generate “quick wins” for your organization, while at the same time building towards a longer-term sustainable architecture. Case studies will also be provided to show how organizations have successfully built data strategies to support their business goals.
LDM Webinar: Data Modeling & Business Intelligence (DATAVERSITY)
Business Intelligence (BI) is a valuable way to use information to show the overall health and performance of the organization. At its core is quality, well-structured data that allows for successful reporting and analytics. A data model helps provide both the business definitions as well as the structural optimization needed for successful BI implementations.
Join this webinar to see how a data model underpins business intelligence and analytics in today’s organization.
Data Lake Architecture – Modern Strategies & Approaches (DATAVERSITY)
Data Lake or Data Swamp? By now, we’ve likely all heard the comparison. Data Lake architectures have the opportunity to provide the ability to integrate vast amounts of disparate data across the organization for strategic business analytic value. But without a proper architecture and metadata management strategy in place, a Data Lake can quickly devolve into a swamp of information that is difficult to understand. This webinar will offer practical strategies to architect and manage your Data Lake in a way that optimizes its success.
Enterprise Architecture vs. Data Architecture (DATAVERSITY)
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how Data Architecture is a key component of an overall Enterprise Architecture for enhanced business value and success.
This presentation reports on data governance best practices. Based on a definition of fundamental terms and the business rationale for data governance, a set of case studies from leading companies is presented. The content of this presentation is a result of the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen, Switzerland.
Data Architecture - The Foundation for Enterprise Architecture and Governance (DATAVERSITY)
Organizations are faced with an increasingly complex data landscape, finding themselves unable to cope with exponentially increasing data volumes, compounded by additional regulatory requirements with increased fines for non-compliance. Enterprise architecture and data governance are often discussed at length, but often with different stakeholder audiences. This can result in complementary and sometimes conflicting initiatives rather than a focused, integrated approach. Data governance requires a solid data architecture foundation in order to support the pillars of enterprise architecture. In this session, IDERA’s Ron Huizenga will discuss a practical, integrated approach to effectively understand, define and implement a cohesive enterprise architecture and data governance discipline with integrated modeling and metadata management.
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
Business Value Through Reference and Master Data Strategies (DATAVERSITY)
Data tends to pile up and can be rendered unusable or obsolete without careful maintenance processes. Reference and Master Data Management (MDM) has been a popular Data Management approach to effectively gain mastery over not just the data but the supporting architecture for processing it. This webinar presents MDM as a strategic approach to improving and formalizing practices around those data items that provide context for many organizational transactions — the master data. Too often, MDM has been implemented technology-first, with a very poor track record (only one-third of projects succeeding on time, within budget, and achieving planned functionality). MDM success depends on a coordinated approach, typically involving Data Governance and Data Quality activities.
Learning Objectives:
• Understand foundational reference and MDM concepts based on the Data Management Body of Knowledge (DMBoK)
• Understand why these are an important component of your Data Architecture
• Gain awareness of reference and MDM frameworks and building blocks
• Know what MDM guiding principles consist of and best practices
• Know how to utilize reference and MDM in support of business strategy
The article is intended as a quick overview of what effective master data management means in today’s business context in terms of risks, challenges and opportunities for companies and decision makers. The article is structured in two main areas, which cover in turn the importance of an effective master data management implementation and the methodology to get there.
The document discusses strategies for implementing master data management (MDM) across an enterprise. It warns against technology-focused strategies that advocate starting small with a single data type or deployment style. Such limited initial approaches cannot solve important business problems and limit the potential return on investment from MDM. Instead, the document recommends a business-focused strategy that addresses key business needs, which can then enable successful expansion of MDM across the organization. Examples are provided of companies that leveraged MDM to solve pressing divisional problems and then expanded its use more broadly.
White Paper - The Business Case For Business Intelligence (David Walker)
This white paper looks at the business case that should lie behind the decision to build a data warehouse and provide a business intelligence solution.
There are three primary drivers for making the investment in a business intelligence solution
1. Measurement and management of the business process
2. Analysis of why things change in the business in order to react better in the future
3. Providing information for stakeholders
As a consequence of the investment there will also be a number of secondary benefits that will help to justify the investment and these are also discussed. Finally there are a number of ‘anti-drivers’ – reasons for not embarking on a business intelligence programme.
Data Governance: A Business Value Driven Approach (Tridant)
This white paper proposes a data governance framework focused on generating business value from enterprise data. The framework includes a data excellence maturity model to assess an organization's ability to leverage data, a data excellence framework with four pillars of agility, trust, intelligence and transparency, and defines data governance through business rules linked to specific business processes and metrics. The goal is to deliver both immediate improvements and long term sustainable management of enterprise data as a business asset.
This presentation will cover the definition of Master Data Management, describe potential MDM hub architectures, outline 5 essential elements of MDM, and describe 11 real-world best practices for MDM and data governance, based on years of experience in the field.
Enterprise-Level Preparation for Master Data Management (AmeliaWong21)
Master Data Management (MDM) continues to play a foundational role in the Data Management Architecture of every 21st century enterprise. In a forward-looking organization, MDM is significant in the Enterprise Integration Hub.
This document discusses strategies for simplifying analytics. It recommends focusing on insights that are important for customers, stakeholders, and employees rather than trying to analyze all possible data. Specific strategies include using next-gen business intelligence and data visualization to improve decision making; applying data discovery techniques to uncover patterns; deploying analytics applications to simplify advanced analytics; and harnessing techniques like machine learning to reduce human effort and produce predictions. The goal is to generate insights that lead to tangible outcomes through a faster, simpler analytical approach.
This document provides an overview of master data management (MDM) and describes how the Kalido MDM solution addresses common MDM challenges. It discusses how master data is critical context for business processes and analytics but is often inconsistent across systems. The Kalido MDM solution enables businesses to consolidate, govern, and maintain master data through capabilities like managing all data domains, sophisticated data modeling, a master data authoring environment, life cycle management, data matching, hierarchy management, and embedded workflow to guide the data stewardship process.
In this presentation, we'll help you better understand Master Data Management (MDM) and data governance, present some useful MDM and data governance best practices, talk about what works and what doesn’t, cover the importance of a holistic approach, and discuss how to get the political aspects right.
Rob Karel - Ensuring The Value Of Your Trusted Data - Data Quality Summit 2008 (DataValueTalk)
- The document discusses building a business case for trusted data and master data management (MDM) initiatives through a bottom-up valuation approach. It recommends starting with an individual line-of-business process to identify and address data quality issues to quickly realize value.
- Examples of target processes include reducing call center inefficiencies through better customer data, decreasing wasted marketing costs from improved targeting, and lowering supply chain breakdowns by ensuring data integrity. Metrics like data freshness, accuracy, and completeness should be used to ensure initiatives are on track.
- A multi-phase, long-term view of data governance as a "trusted data program" is advocated over viewing MDM as the goal in itself. Buy-
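The metrics named above (freshness, accuracy, completeness) can be tracked with very little machinery. A minimal sketch in Python, where the record fields, thresholds, and sample data are all illustrative assumptions rather than any vendor's implementation:

```python
from datetime import datetime, timedelta

# Hypothetical customer records; field names and values are illustrative.
records = [
    {"id": 1, "email": "a@example.com", "phone": "555-0100", "updated": datetime(2024, 1, 10)},
    {"id": 2, "email": None,            "phone": "555-0101", "updated": datetime(2022, 6, 1)},
    {"id": 3, "email": "c@example.com", "phone": None,       "updated": datetime(2024, 3, 5)},
]

def completeness(records, fields):
    """Share of non-null values across the given fields."""
    total = len(records) * len(fields)
    filled = sum(1 for r in records for f in fields if r.get(f) is not None)
    return filled / total

def freshness(records, as_of, max_age_days=365):
    """Share of records updated within the allowed age window."""
    cutoff = as_of - timedelta(days=max_age_days)
    return sum(1 for r in records if r["updated"] >= cutoff) / len(records)

as_of = datetime(2024, 4, 1)
print(f"completeness: {completeness(records, ['email', 'phone']):.2f}")  # 0.67
print(f"freshness:    {freshness(records, as_of):.2f}")                  # 0.67
```

Reported on a schedule, ratios like these give the "on track" signal the summary describes: a dip in completeness or freshness flags a process problem before it shows up as wasted marketing spend or supply chain breakdowns.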
This document discusses strategies for master data management (MDM), including various MDM architectures and data synchronization techniques. It begins by defining key MDM concepts like master data, data domains, and MDM approaches. It then presents four business cases where MDM could provide value. The document goes on to describe three common MDM architectures: single central repository, central hub and spoke, and virtual integration. It also discusses data synchronization techniques like trigger-based and message-based approaches. Finally, it provides a case study example of how MDM could solve a data management problem.
Why Master Data Management Projects Fail and what this means for Big Data (Sam Thomsett)
This document discusses why Master Data Management (MDM) projects often fail and the implications for big data initiatives. Some key reasons for MDM project failures include a lack of enterprise thinking and executive sponsorship, weak business cases, treating MDM as an IT solution rather than business solution, unrealistic roadmaps, and poor communications planning. The document argues that establishing a data governance strategy, enterprise reference architecture, and prioritized project roadmap are important for MDM and big data success.
This whitepaper discusses holistic data governance and provides a framework for organizations to establish effective data governance. It argues that data governance should be treated as a core business function, rather than an IT project, in order to optimize value from data assets. The framework outlines 10 facets of data governance, including vision/business case, people, tools/architecture, policies, organizational alignment, measurement, change management, dependent processes, program management and defined processes. It also defines levels of data governance maturity from initial/ad hoc to optimized/function in order to help organizations assess their current state and develop a roadmap.
The document discusses how companies can make advanced analytics work for them. It provides several guides for managers, including identifying the right data sources, building simple analytics models focused on business goals, and developing tools everyone can understand. While acquiring big data is important, companies must transform their culture and capabilities to develop business-relevant analytics that can optimize outcomes. Executing analytics properly requires a flexible approach and cultural shift within the organization.
Master Data Management (MDM) for Mid-Market (Vivek Mishra)
This document discusses master data management (MDM) solutions for mid-sized businesses. It begins by introducing MDM and its benefits, such as a single source of truth and improved data quality. It then outlines some common challenges for implementing MDM in mid-sized companies, such as high costs, maintaining multiple data domains, and the need for organizational change management. The document provides advice on overcoming these challenges by selecting flexible and affordable MDM platforms. It also describes key properties of effective MDM, like ongoing data governance and providing a 360-degree view of customers. Finally, it introduces Compunnel Digital's partnership with Profisee to offer modernized MDM solutions tailored for mid-sized organizations.
This document discusses the importance of organizational readiness and cross-functional collaboration for successful Master Data Management (MDM) implementation. MDM requires buy-in across the organization and treating data as a shared resource rather than isolated in departments. The document recommends developing a shared vision, comprehensive strategy, and governance structure to manage MDM. Effective change management and open communication are also key to overcoming resistance and ensuring all stakeholders contribute to high quality master data.
Master Data Management's Place in the Data Governance Landscape (CCG)
This document provides an overview of master data management and how it relates to data governance. It defines key concepts like master data, reference data, and different master data management architectural models. It discusses how master data management aligns with and supports data governance objectives. Specifically, it notes that MDM should not be implemented without formal data quality and governance programs already in place. It also explains how various data governance functions like ownership, policies and standards apply to master data.
Emergency Medical Association is a group of 250 emergency physicians responsible for effectively managing emergency departments. They should implement business intelligence values and applications like data-driven decision making to improve patient outcomes, reduce costs, and ensure future success.
Real-time business intelligence involves delivering business operation information as it occurs. Credit card companies use real-time BI by approving purchase amounts via mobile to retain customers longer. The process involves feeding transactions to a system maintaining the current enterprise state in real-time for tactical decisions alongside classic strategic functions.
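The pattern described above — feeding each transaction into a system that maintains the current enterprise state for tactical decisions — can be sketched in a few lines. The credit limit, field names, and approval rule below are illustrative assumptions, not a real card issuer's logic:

```python
from collections import defaultdict

class RealTimeState:
    """Toy sketch of real-time BI: each incoming transaction updates the
    current state, which a tactical decision (here, a credit-limit check)
    consults immediately."""

    def __init__(self, credit_limit=1000.0):
        self.credit_limit = credit_limit
        self.spent = defaultdict(float)  # running spend per customer

    def process(self, customer_id, amount):
        """Approve or decline, then fold the transaction into the state."""
        approved = self.spent[customer_id] + amount <= self.credit_limit
        if approved:
            self.spent[customer_id] += amount
        return approved

state = RealTimeState(credit_limit=1000.0)
print(state.process("c1", 600.0))  # True  - within limit
print(state.process("c1", 500.0))  # False - would exceed limit
print(state.process("c1", 300.0))  # True  - 600 + 300 <= 1000
```

In a production system the in-memory dictionary would be replaced by a streaming platform and a state store, but the shape is the same: state is updated as events arrive, and tactical decisions read that state rather than waiting for a batch report.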
Business intelligence plays a key role in modern business by processing vast information into understandable formats for strategic, tactical, and operational decision making. This helps decision makers faced with information overload and inconsistent data.
Master data management executive mdm buy in business case (2)
1. White Paper
Recipe for C-Level Buy-In to MDM
Three actions expose hidden costs of bad data, quantify revenue gains from a data governance investment, and sell MDM to the business
2. This document contains Confidential, Proprietary and Trade Secret Information (“Confidential Information”) of Informatica Corporation and may not be copied, distributed, duplicated, or otherwise reproduced in any manner without the prior written consent of Informatica.
While every attempt has been made to ensure that the information in this document is accurate and complete, some typographical errors or technical inaccuracies may exist. Informatica does not accept responsibility for any kind of loss resulting from the use of information contained in this document. The information contained in this document is subject to change without notice.
The incorporation of the product attributes discussed in these materials into any release or upgrade of any Informatica software product—as well as the timing of any such release or upgrade—is at the sole discretion of Informatica.
Protected by one or more of the following U.S. Patents: 6,032,158; 5,794,246; 6,014,670; 6,339,775; 6,044,374; 6,208,990; 6,850,947; 6,895,471; or by the following pending U.S. Patents: 09/644,280; 10/966,046; 10/727,700.
This edition published March 2014
4.
Executive Summary
In The Conference Board CEO Challenge 2014 [1] survey, executives were asked to identify and rank the most
pressing challenges they face and their strategies for addressing each. Leading challenges included human
capital, customer relationships, innovation, operational excellence, and corporate brand and reputation.
Curiously, “master data” was not listed.
Few business executives may be aware of master data management (MDM) technology per se [2], but IT
professionals tasked with developing and implementing enterprise information management strategies
clearly recognize that the referential integrity of master data is a critical underpinning of nearly any initiative
focused on employees, customers, products, suppliers, and other CEO priorities. A common challenge for IT
professionals is measuring and communicating the economic value of MDM and related data governance
investments in a way that will trigger funding and sponsorship from the executive suite. “There are far too
many efforts where the IT team is pushing the MDM idea, and there isn’t enough pull from the business side
because there isn’t a clear articulation of what the business outcome will be,” says Gartner analyst
Ted Friedman. [3]
Many executives know they have “a data problem.” Unfortunately, they rarely recognize it as a master data
problem or understand the magnitude of the issue and its adverse financial impact. IT professionals have a
much deeper understanding of the nature and extent of problems associated with inconsistent master data,
but they, too, are often unable to quantify the problem in financial terms, which makes it very difficult to
present an investment proposal.
In this white paper, we’ll examine the challenge of quantifying the cost of managing master data and
prescribe three concrete actions you as an IT professional can take to build the business case.
The Hidden Costs of Bad Data
Every seasoned manager has experienced master data management problems in one form or another. Sales
managers are commonly confronted with regional forecasts that don’t add up to the consolidated figure used
by corporate finance. HR executives in global organizations that operate in multiple countries and employ a
diverse mix of contractors, part-time and full-time workers often struggle with the most basic issue of having an
accurate and consistent view of headcount.
Although it’s easy to gain executive agreement on the need to improve data quality, it is more difficult to
secure approval for significant MDM investments required to solve the problem. There are two fundamental
reasons for this.
[1] The Conference Board CEO Challenge 2014
[2] “Are LOB Executives Involved in MDM or Is It Mostly an IT Initiative?” Informatica blog post, 2011
[3] “Master Plan: Getting Your Money’s Worth from MDM,” TechTarget, 2013
5.
First, many of the costs associated with fragmented data are not obvious. These hidden
costs can come in many forms. For example, choosing a supplier based exclusively on the
lowest price can be more expensive than choosing a supplier that delivers cost savings
through faster delivery times, better trade discounts, lower rejection rates, and lower
shipping expenses. Billing errors due to bad data are manifested in uncollectable debt
write-offs, diminished cash flow from delayed payments, and customer abandonment. Data
quality assessments that include interviews with individuals engaged in business processes
where master data is generated and consumed frequently uncover comments like these:
“A $50 part can cost us $100,000 in chargebacks”
“I wish I would be able to ensure we don’t over-survey the same customer,
especially if they just had a negative experience”
“We have a hard time marrying design with build cost with sub-parts
maintenance cost”
A second obstacle to gaining approval for an MDM investment is that executives rarely
understand the complexities of the underlying IT infrastructure. It’s tempting for many to
believe that the problem can easily and inexpensively be “fixed” in a single system. For
example, executives may perceive that all relevant customer data resides in the CRM
system, while in fact, customer data may also reside in marketing automation systems,
billing systems, financial planning systems, data warehouses, and business intelligence
systems. In each system, customers are likely defined and structured differently, and multiple
records commonly exist for the same customer across the various systems.
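To make the fragmentation above concrete, here is a toy sketch showing how one customer can surface as three differently keyed records across systems. The systems, field names, and matching rule are invented for illustration; production MDM matching uses far richer, often probabilistic rules.

```python
# Illustrative only: the same customer appears under different keys in
# different systems, and only a matching rule reveals the duplication.
from collections import defaultdict

records = [
    {"system": "CRM",       "id": "C-1001", "name": "ACME Corp.",       "email": "ap@acme.com"},
    {"system": "Billing",   "id": "78-334", "name": "Acme Corporation", "email": "AP@ACME.COM"},
    {"system": "Marketing", "id": "m_552",  "name": "acme corp",        "email": "ap@acme.com"},
    {"system": "CRM",       "id": "C-2040", "name": "Globex Inc.",      "email": "billing@globex.com"},
]

def match_key(rec):
    """A deliberately naive matching rule: normalized email address."""
    return rec["email"].strip().lower()

clusters = defaultdict(list)
for rec in records:
    clusters[match_key(rec)].append(rec)

# Any cluster with more than one record is a candidate duplicate set.
duplicates = {key: recs for key, recs in clusters.items() if len(recs) > 1}
```

Here the ACME customer resolves to a single cluster spanning three systems, while Globex, with only one record, is untouched.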
Building the Business Case
The process of quantifying the costs of inconsistent and/or duplicated master data and
developing a business case for MDM is as important as the business case itself. It is
essential to focus the effort on key challenges of high importance to both IT and business
and to take a collaborative approach that includes all stakeholders and follows an
objective methodology.
How to quantify costs and make the case? Start with a hypothesis derived from preliminary
investigation, followed by structured research and analysis. The research and analysis
should include an examination of the current state of your business processes and IT
infrastructure, an assessment of the desired state, and a definition of metrics that
define success.
MAKING THE CASE FOR MASTER DATA AUTHORITY
Focus on three steps to gain approval for an MDM investment:
1. Explore processes and people. Identify key business and IT stakeholders.
2. Get curious. Conduct discovery on their business processes, data drags, and ideal scenarios, and how they measure success.
3. Analyze and quantify benefits. Analyze your findings and build your case.
6.
Identify and Organize the Stakeholders
Ensuring alignment between IT and business is crucial. Both groups have vital contributions
to make, and both hold stakes in the success of the project. IT has a technological
understanding of how to improve the accuracy and relevance of master data. Business
owners understand the business processes and how they influence performance metrics. To
bridge the gap, IT leaders must demonstrate how improving master data solves business
process problems.
For example, a major US energy company embarked on an MDM program to address
challenges associated with the configuration and location of hundreds of thousands
of assets it designs, builds, and maintains. The IT organization had a major stake in
addressing the issue, in part because significant efforts to create analytical deliverables
were being wasted due to poor data quality across silos. Reports and dashboards they
produced were frequently rejected or questioned by the business.
On the business side, a lack of reliable data, terminology, and location information related
to substations, breakers, and meters was creating a distorted picture of maintenance and
operational expenses. Equipment failures, unnecessary repairs, fines, and tax liabilities
were impacting the field operations and finance organizations. Duplicate and inconsistent
vendor and material data was exposing the entire supply chain to excess inventory, non-
optimized discounting, and supply risk.
By engaging stakeholders from both IT and business, the MDM project team was able to
surface specific problems related to data quality that each organization was experiencing.
Both groups shared a common interest and motivation to resolve those problems. As a
result, the teams worked closely together to ensure the MDM technology strategy aligned
with the needs of the business.
Prepare for and Conduct Interviews
Conducting extensive interviews with a sampling of individuals involved in each step of a
business process is one of the best ways to uncover hidden problems and opportunities.
Ensure that your list of interviewees is comprehensive. Every cross-organizational job
function must be represented, regardless of an individual’s seniority or role.
Prepare for the interviews by entering your questions in a spreadsheet that can be used
later to analyze and quantify the results. Use separate tabs with a questionnaire for each
individual role. For example, you may have tabs for an IT manager, a data administrator, a
customer service representative, a risk and compliance officer, a marketing manager, a
financial analyst, a sales representative, and any number of other relevant roles for your
initiative. Each questionnaire will contain quantitative and qualitative questions. Figure 1
lists several examples.
ACTION ITEM
Explore processes and people. An effective way to identify key stakeholders is to start with an examination of your current business processes and IT infrastructure. Explore how the current state evolved, what the desired state might look like, what changes to the organizational structure might be needed, and who will be affected.
7.
Figure 1. Organizing Your Interviews
Using a spreadsheet and organizing questions with tabs for different roles allows you to
combine interview answers, analysis, and quantification of results in the same file. Our
sample questions below address the IT manager, data administrator, and customer service
roles but also include questions for trade executors, risk management, marketing, finance,
and sales roles.
IT Manager Sample Questions
Software Licenses: Which of the following software do you currently have installed?
1 Data Cleansing
Software | Number of licenses | Applications that leverage software | Annual maintenance cost | Number of systems with real-time connectors | Annual postal directory cost
First Logic
Group 1
Platon
Trillium
Other 1 __________
Other 2 __________
What package do you plan to use for your final solution?
2 Customer Hub
Software | Number of licenses | Applications that leverage software | Annual maintenance cost | Live?
IBM
Initiate
Siebel
SAP MDM
Other 1 __________
Other 2 __________
3 Hierarchy Management Software
Do you have software in place to manage hierarchies? If so, what are they and which
systems serve as the source?
Infrastructure and Development
4 Disk Consolidation
As part of the scope of the project, would it be possible for you to replace the customer
database in your applications with the common hub? If so, which systems and what is the
size of the customer database in each?
Do your systems support merging of customer records? If so, which systems and what is the
size of your customer database in these systems?
8. 6
5 Development effort
… Ask questions around interfaces in place to support account and individual data across
systems, interfaces planned for development, customizations to standard applications for
data quality, etc.
6 Your Staff
… Ask questions about number of full-time analysts, cost/analyst/year, full-time developers,
time spent training new developers, etc.
7 Integration Projects (number of integrations performed per year? Workdays in typical
integration project? Typical cost of data conversion? Etc.)
8 Improving Performance of Prior Projects (failed projects that didn’t achieve ROI? SWAT
team initiatives in place to improve CRM adoption? Etc.)
Data Administrator Sample Questions
Current Effort
1 What efforts are currently in place to cleanse data across systems?
Activity | Resources employed | Annual cost
Manual cleansing
Record merging
3rd party services
Dun and Bradstreet
Other ___________
2 What efforts are in place to keep data up to date?
Which of the following are current sources of leads and which are sources that could
be used if data could be reconciled with current files?
Campaign type | Annual number of records purchased | Price per record | Potential source
Dun and Bradstreet
Cold Calling
AVOX
Axiom
3rd Party Service
Other ___________
Current Process Overview
3 Describe the process by which master data records are created, updated, and
managed. Include counter-party and individual data.
4 Describe the organizational responsibilities for data management and the personnel
assigned to the task.
Controls (e.g., how is Sarbanes-Oxley compliance met through controls of data? How are you
addressing other requirements?)
9.
Customer Support Sample Questions
1 Call handling: How does your company provide support?
Order entry method | Resources employed | Time to handle | Time to create customer | Cost per case
Telephone
Web Self-Service
Other 1 ___________
Other 2 ___________
Total number of accounts in the Customer Service DB ________
Total number of contacts in the Customer Service DB _________
How would you describe the impact of incomplete or inaccurate data on the
customer service experience (incorrect mailings, failure to upsell, bad service,
long calls, etc.)?
2 Mailing Charges (What items are mailed to customers from customer service? At
what rate are these returned undelivered? Is your mailing CASS certified? If so,
at what cost? Etc.)
In addition to interviewing different roles, you should also include a tab in your spreadsheet
for questions about the organization as a whole. These can range from broad issues like
your company’s industry, annual revenues, current data assessment and IT topology to more
specific details about key functional areas like sales, marketing, support, and finance. Other
tabs can contain additional input fields and calculation formulas that will streamline
your analysis and business case quantification.
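As a rough stand-in for the spreadsheet organization described above, the sketch below renders one questionnaire per role as CSV text (one file per role in place of one tab per role). The roles and questions are illustrative, not taken from the paper.

```python
# A minimal stand-in for a role-tabbed interview workbook: each role gets
# its own CSV "sheet" with a blank answer column to fill during interviews.
import csv
import io

questionnaires = {
    "it_manager": [
        ("quantitative", "How many data-cleansing licenses do you maintain, and at what annual cost?"),
        ("quantitative", "How many integration projects do you run per year?"),
    ],
    "data_administrator": [
        ("qualitative", "Describe the process by which master data records are created and updated."),
    ],
    "customer_service": [
        ("qualitative", "How does incomplete or inaccurate data affect the service experience?"),
    ],
}

def render_role_sheet(role):
    """Render one role's questionnaire as CSV text (a stand-in for a workbook tab)."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["question_type", "question", "answer"])
    for qtype, question in questionnaires[role]:
        writer.writerow([qtype, question, ""])  # answer column filled in later
    return buf.getvalue()

sheet = render_role_sheet("it_manager")
```

Keeping questions in structured form like this makes the later quantification step a matter of reading the answer column back in, rather than re-keying notes.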
To ensure objectivity, candid responses, and credibility, it is best to use independent
professionals from outside the organization to conduct the interviews. Internally-driven
interviews can result in hesitancy by interviewees to reveal all information and can also be
colored by internal politics and biases. Regardless of whether you conduct the discovery
yourself or use an independent resource, advise interviewees in advance on the topics
to be discussed so they have an opportunity to think about their responses. The best way
to capture and communicate findings is to record transcripts and select direct quotes that
effectively convey key points. Unfiltered voices of those “on the ground” enhance the
veracity and credibility of a business case.
“We would save so much time and client frustration if our telephony system
could identify a customer and pre-populate our call center screen with all
core account and policy data for this client”
“The synch between our service management system and our ERP system
is automatic, and the parts that are in our ERP system can be overwritten
automatically. As such, we end up with four different part numbers for the
same part. Duplicate part numbers are a huge opportunity”
“I buy one fitting…it is listed as 30 different part numbers in 30 different
warehouses”
ACTION ITEM
Get curious. Interview both business and IT stakeholders about what they do, how they do it, where they waste time or encounter problems because of inconsistent, disparate data, and how they’d change the process if they could. Organize your spreadsheet of questions by role, prepare your interviewees, and conduct recorded discussions in an area free from distraction.
10.
Analyze Your Findings and Build the Case
The interviews and research serve as inputs to quantitative analysis. You’ll want to quantify costs and
benefits, test and measure the sensitivity of underlying assumptions, and evaluate alternative scenarios to
build an effective business case. Also, make sure you quantify the costs of not taking action—what are the
consequences of inconsistent master data?
Common elements of a business model include:
• Potential benefits – Improving master data associated with any functional area of the business will usually
carry multiple benefits. These can be categorized as hard benefits (quantifiable cost savings) and soft
benefits (revenue enhancers that are less tangible and more difficult to quantify objectively). For
example, improving product master data may yield a series of hard benefits that collectively result in better
supplier discounts. Those benefits could include increased discounting for timely payments, increased
agility to negotiate payment terms, elimination of unnecessary systems, improvements to the downstream
validation process, and less time searching for information. Soft benefits may include a better customer
experience, which in turn improves customer loyalty and drives higher revenue.
• Metrics/KPIs for each benefit – Your business case must translate quality master data into dollars as well as
outline the opportunity costs of bad master data. For instance, what’s the value of reduced call center costs
as a result of quality data? And how would you quantify lost revenue opportunities from bad master data?
Make sure you define units of measure—e.g., cost per support call, average support call resolution time,
etc.—associated with each benefit.
• A range of projected financial values for each benefit – You’ll want to quantify the projected financial benefits
using a range rather than a single value. These are commonly represented as conservative, optimistic, and
most likely.
The output of your analysis, expressed in financial terms, is ultimately what will be subject to the greatest
scrutiny when the business case is presented to senior management. So make sure that all stakeholders agree
with the model, set of assumptions, and the inputs used for analysis.
Figure 2. Benefit Summary Example
Your analysis should list benefits of implementing MDM with a range of projected metrics for each. The table
below represents a sample of benefits.
Benefit | Conservative | Likely | Optimistic
Savings in maintenance costs
Reduced IT staff effort
Reduced business effort associated with regulatory report generation
Reduced compliance costs (fines and audits)
Reduced airfreight expenditures
Reduction of bill of material component cost as a % of sales
Reduced factory or production unplanned downtime
Total annual benefits
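The per-scenario roll-up in Figure 2 can be sketched as a sum over benefit line items. The dollar figures below are invented placeholders, not values from the paper.

```python
# Illustrative roll-up of benefit line items into the three scenario totals
# shown in Figure 2. All dollar amounts are invented placeholders.
benefits = {
    "Savings in maintenance costs": (120_000, 180_000, 240_000),
    "Reduced IT staff effort":      ( 90_000, 140_000, 200_000),
    "Reduced compliance costs":     ( 50_000,  75_000, 110_000),
}

SCENARIOS = ("conservative", "likely", "optimistic")

# Total annual benefits per scenario: sum the i-th value of every line item.
totals = {
    scenario: sum(values[i] for values in benefits.values())
    for i, scenario in enumerate(SCENARIOS)
}
```

Structuring each benefit as a (conservative, likely, optimistic) triple keeps the "range rather than a single value" discipline mechanical: the totals row is derived, never hand-entered.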
11.
Figure 3. Benefit Detail
In addition to summarizing benefit metrics, your analysis should provide detail on each. For instance, describe
the benefit and quantify supporting details, as the example below illustrates. Be transparent with your
information sources and analysis logic.
Benefit # 11: Reduced Losses from Revenue Leakage
Company X reports that significant revenue is lost (leakage) due to data quality issues such as
incorrect billing information and inaccurate costing caused by poor linkage across customers, contracts,
and facility maintenance costs. By correcting even a small portion of the reported 8-9% leakage with
more reliable data, a significant financial impact may be realized.
Metric | Conservative | Likely | Optimistic
Total operating revenue ($B) | $10.128 | $10.128 | $10.128
Reported leakage | 8% | 9% | 10%
Lost revenue/year | $810,240,000 | $911,520,000 | $1,012,800,000
Recovered revenue with MDM | 1.0% | 1.5% | 2.0%
Gross margin | 18% | 18% | 18%
Annual associated value | $1,457,600 | $2,459,700 | $3,644,000
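The arithmetic behind this benefit can be reproduced directly from the table's inputs. Note that the paper's printed "annual associated value" figures differ slightly from the raw calculation, presumably because the source rounded its revenue input; the formula itself is the point here.

```python
# Revenue-leakage benefit: value recovered per year at a given gross margin.
def leakage_benefit(total_revenue, leakage_rate, recovery_rate, gross_margin):
    """Annual value of recovering part of leaked revenue at a given margin."""
    lost_revenue = total_revenue * leakage_rate      # revenue leaked per year
    recovered = lost_revenue * recovery_rate         # portion MDM wins back
    return recovered * gross_margin                  # value at gross margin

# Inputs from the table: $10.128B revenue, 18% gross margin.
conservative = leakage_benefit(10.128e9, 0.08, 0.010, 0.18)
likely       = leakage_benefit(10.128e9, 0.09, 0.015, 0.18)
optimistic   = leakage_benefit(10.128e9, 0.10, 0.020, 0.18)
```

Exposing the formula this way lets reviewers stress-test each assumption (leakage rate, recovery rate, margin) independently, which is exactly the sensitivity analysis the business case requires.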
The ultimate goal of a business case exercise is to create a presentation (often delivered to a senior
management finance committee) and supporting financial document that will objectively communicate
the return on investment. An effective case must quantify the existing problem as well as the benefits of
implementing MDM. Those benefits can usually be grouped into three categories: increased revenue,
reduced costs, and improved productivity.
Your business case will also include qualitative benefits that may be intuitive but cannot be objectively
quantified. For example, MDM programs inherently reduce an organization’s exposure to compliance
risk by providing better audit reporting capabilities, internal controls, and data governance. Data quality
improvements increase the accuracy of analytic reporting, which in turn leads to smarter decisions. Intangible
benefits of this nature are important to call out but should not take precedence over the more compelling and
objective “hard data” that most executives prefer to support investment decisions.
In structuring your presentation, the following components should be included:
• Executive Summary – A single slide synopsis of the problem, opportunity, solution, and key success metrics
(Return on Investment, Net Present Value, Payback Period, and Annual Financial Benefit)
• Objectives and Approach – Key objectives, participants and roles, project scope (business functions and IT
architecture domains), and major steps in the business case methodology
• Business Challenges – An assessment of current IT and business challenges and associated cost, revenue
and risk drivers
• Observations and Findings – A summary of key findings supported by and illustrated with direct quotes
from interviews
• Business Value Quantification – A detailed financial quantification of benefits over a five-year time horizon
(expressed as a range of conservative, likely, and optimistic scenarios), and categorized by increased
revenue, cost reduction, and productivity improvements; qualitative benefits should be appended here
as well
• Comps – Benchmark KPIs that compare a selection of relevant metrics for your organization to
industry peers
12.
• Proposed Solution – Summary view that includes a functional description, graphical architectural
representation, key project assumptions, project milestones and timeline, project phases, and other
relevant information
• Analyst Research – Any relevant third-party research that supports the business case
Show, Don’t Tell
Make your point using the three or four key numbers that quantify the value. Typically these include Return
on Investment, Net Present Value, Payback Period, and Annual Benefit (expressed as cost savings or revenue
increases). Simple charts or graphs add further clarity.
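The headline metrics named above can be computed from a projected cash-flow stream. The sketch below uses invented figures (a $1M upfront investment, five years of benefits, a 10% discount rate) purely to show the mechanics.

```python
# Headline business-case metrics from a projected cash-flow stream.
# cashflows[0] is the upfront investment (negative); later entries are
# annual net benefits. All figures are invented for illustration.
def npv(rate, cashflows):
    """Net present value of a cash-flow stream at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_period(cashflows):
    """First year in which cumulative (undiscounted) cash flow turns non-negative."""
    cumulative = 0.0
    for year, cf in enumerate(cashflows):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None  # investment never pays back within the horizon

cashflows = [-1_000_000, 400_000, 500_000, 600_000, 600_000, 600_000]

roi = (sum(cashflows[1:]) + cashflows[0]) / -cashflows[0]  # net gain over cost
project_npv = npv(0.10, cashflows)
years_to_payback = payback_period(cashflows)
```

Presenting the same cash-flow stream through all three lenses (ROI, NPV, payback) anticipates the different questions a finance committee will ask.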
Case Study: Driving MDM Value at AutoTrader.com
Atlanta-based AutoTrader.com had a problem. As an online platform used by auto dealers, consumers,
and manufacturers to advertise vehicles, its homegrown, outmoded MDM system lacked any form of data
governance, resulting in multiple identities for a single customer. One individual dealer, for instance, had 27
different identities associated with it.
The company’s business goal was to target a universe of some 50,000 automotive dealers, manufacturers,
and their associated marketing companies—which it refers to as “customers”—to advertise on its Web site.
A team comprising representatives from IT and the business collaborated in analyzing the situation and
developing a business case for an MDM initiative.
13.
The team noted a discrepancy between its 50,000 targeted customers and the 275,000
customers represented in its in-house developed MDM system. Not only were there multiple
identities for a single customer, but relationships between customers were a tangled knot
that no one could understand.
As Scott Salter, senior director of enterprise data and shared services of AutoTrader.com,
noted, “The relationship between our customers—dealers, manufacturers, and
their advertising agencies—is tightly interwoven. However, we didn’t understand those
connection points. We couldn’t tell how many Ford or Nissan dealers there were out
there, the interrelationship between each dealer, or sometimes which marketing agency
represented which dealer. In customer sales presentations, too much precious sales time was
wasted simply trying to understand how broad the dealer’s network was.”
In building the business case, AutoTrader focused on quantifying anticipated increases in
revenue as well as cost reductions. According to Salter, “the business case for Informatica
MDM was based on two key criteria: driving increased advertising revenue and cost
avoidance. We measure the impact of Informatica MDM on both of these in multiple
millions of dollars.”
The multi-domain solution AutoTrader implemented is specifically tailored to the unique
needs of the automotive dot-com giant and will facilitate the adoption of other domains
beyond the customer master, such as vehicle inventory and the end-user buyers of the
vehicles. The system has allowed AutoTrader to introduce proactive master data governance
throughout the organization. Business users, data stewards, and IT managers can now
create and monitor master data quickly and easily.
Shortly after going live with the first phase of its MDM project, AutoTrader has already
realized many of the benefits projected in its business case. According to the team at
AutoTrader, the implementation of MDM has:
• Led to an anticipated multimillion-dollar increase in on-line advertising revenues
• Created a reliable, single version of dealers and vehicle manufacturers
• Increased value of data by making it relevant, holistic, and trustworthy
• Reduced cost of data by increasing advertising sales effectiveness
• Lowered time and resources involved in navigating overlapping customer records
• Introduced enterprise-wide, proactive master data governance
• Gained future extensibility through adoption of other relevant MDM domains
MDM Best Practices: Value Derived from Successful Customers
While the basic steps for building a business case for MDM can apply to any organization
in any industry, the implementation of MDM varies widely. Every organization has its
unique data challenges, IT topology, business processes, and master data requirements.
When deploying MDM, it’s important to employ a flexible, multi-functional business
model that can adapt to your organization’s people, processes, data, and technology
infrastructure. It is equally important to recognize that the need for accurate and consistent
master data is rarely confined to a single business function. For example, in most extended
enterprises, customer master data is routinely shared across sales, marketing, finance,
manufacturing, and even suppliers and other business partners. Although many MDM
projects start by addressing the needs of a specific functional area, a multi-domain business
model and scalable technology platform can allow your initial investment to be cost-
effectively applied in other parts of the organization as needs expand.
ACTION ITEM
Quantify the benefit. Your analysis and interviews should result in lists of specific hard and soft business benefits you’ll realize from investing in MDM, with conservative, likely, and optimistic figures for cost savings or revenue gains. Include qualitative findings to support the hard numbers.