This document discusses why Master Data Management (MDM) projects often fail and the implications for big data initiatives. Key reasons for MDM project failures include a lack of enterprise thinking and executive sponsorship, weak business cases, treating MDM as an IT solution rather than a business solution, unrealistic roadmaps, and poor communications planning. The document argues that establishing a data governance strategy, an enterprise reference architecture, and a prioritized project roadmap is essential for MDM and big data success.
The Missing Link in Enterprise Data Governance - Automated Metadata Management (DATAVERSITY)
So many companies and organizations are in the same boat. They’re drowning in their data — so much data, from so many different sources. They understand that data governance is hugely important for them to be able to know their data inside and out and comply with regulations. What many companies have not yet come to terms with when implementing their data governance strategy and supporting tools, is the criticality of metadata in the process. As the ‘data about data,’ metadata provides the value and purpose of the data content, thereby becoming an extremely effective tool for quickly locating information – a must for BI groups dealing with analytics and business user reporting.
Octopai's CEO, Amnon Drori will discuss this critical missing link in enterprise data governance and the impact of automating metadata management for data discovery and data lineage for BI. He'll demonstrate how BI groups use Octopai to not only locate their data instantly, but to quickly and accurately visualize and understand the entire data journey to enable the business to move forward.
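The “entire data journey” described above is, at its core, a lineage graph over data assets. A minimal sketch of downstream impact analysis over such a graph (the asset names and edges are invented for illustration; this is not Octopai’s implementation):

```python
from collections import defaultdict

# Hypothetical lineage edges: source asset -> downstream asset
edges = [
    ("crm.customers", "staging.customers"),
    ("staging.customers", "warehouse.dim_customer"),
    ("warehouse.dim_customer", "bi.sales_report"),
]

graph = defaultdict(list)
for src, dst in edges:
    graph[src].append(dst)

def downstream(asset):
    """Return every asset that ultimately depends on `asset`."""
    seen, stack = set(), [asset]
    while stack:
        for nxt in graph[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# Impact analysis: which assets are affected if crm.customers changes?
print(sorted(downstream("crm.customers")))
```

The same traversal run in reverse (following edges backwards) answers the lineage question “where did this report’s data come from?”.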
The Importance of Master Data Management (DATAVERSITY)
Despite its immaterial nature, data has a tendency to pile up as time goes on, and can quickly be rendered unusable or obsolete without careful maintenance and streamlining of processes for its management. This presentation will provide you with an understanding of reference and Master Data Management (MDM), one such method for keeping mass amounts of business data organized and functional towards achieving business goals.
MDM’s guiding principles include the establishment and implementation of authoritative data sources and effective means of delivering data to various business processes, as well as increases to the quality of information used in organizational analytical functions (such as BI). To that end, attendees of this webinar will learn how to:
Structure their Data Management processes around these principles
Incorporate Data Quality engineering into the planning of reference and MDM
Understand why MDM is so critical to their organization’s overall data strategy
Discuss foundational MDM concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK)
Strategic Business Requirements for Master Data Management Systems (Boris Otto)
This presentation describes strategic business requirements of master data management (MDM) systems. The requirements were developed in a consortium research approach by the Institute of Information Management at the University of St. Gallen, Switzerland, and 20 multinational enterprises.
The presentation was given at the 17th Americas Conference on Information Systems (AMCIS 2011) in Detroit, MI.
The research paper on which this presentation is based can be found here: http://www.alexandria.unisg.ch/Publikationen/Zitation/Boris_Otto/177697
Master Data Management - Aligning Data, Process, and Governance (DATAVERSITY)
Master Data Management (MDM) can provide significant value to the organization in creating consistent key data assets such as Customer, Product, Supplier, Patient, and the list goes on. But getting MDM “right” requires a strategic mix of Data Architecture, business process, and Data Governance. Join this webinar to learn how to find the “sweet spot” between technology, design, process, and people for your MDM initiative.
What has changed in DMBoK V2?
We have been working with DMBoK V1 for many years, and it is great to finally get to read and study the changes. I did a quick comparison between the two versions.
Five Things to Consider About Data Mesh and Data Governance (DATAVERSITY)
Data mesh was among the most discussed and controversial enterprise data management topics of 2021. One reason people struggle with data mesh concepts is that many open questions remain unexamined:
Are you thinking beyond analytics? Are you thinking about all possible stakeholders? Are you thinking about how to be agile? Are you thinking about standardization and policies? Are you thinking about organizational structures and roles?
Join data.world VP of Product Tim Gasper and Principal Scientist Juan Sequeda for an honest, no-bs discussion about data mesh and its role in data governance.
Metadata management is critical for organizations looking to understand the context, definition and lineage of key data assets. Data models play a key role in metadata management, as many of the key structural and business definitions are stored within the models themselves. Can data models replace traditional metadata solutions? Or should they integrate with larger metadata management tools & initiatives?
Join this webinar to discuss opportunities and challenges around:
How data modeling fits within a larger metadata management landscape
When data modeling can provide “just enough” metadata management
Key data modeling artifacts for metadata
Organization, Roles & Implementation Considerations
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
Gartner: Master Data Management Functionality (Gartner)
Gartner will further examine key trends shaping the future MDM market during the Gartner MDM Summit 2011, 2-3 February in London. More information at www.europe.gartner.com/mdm
Overcoming the Challenges of your Master Data Management Journey (Jean-Michel Franco)
This presentation runs you through all the key steps of an MDM initiative. It considers and showcases the key milestones and building blocks that you will have to roll out to make your MDM journey a success.
-> Please contact Talend for a dedicated interactive session with a storyboard by customer domain
Requirements for a Master Data Management (MDM) Solution - Presentation (Vicki McCracken)
Working on Requirements for a Master Data Management solution and looking for thoughts on how to approach the requirements? This is an overview presentation that complements my guide on how to approach requirements for a Master Data Management solution (Requirements for an MDM Solution). You may be able to leverage all or some of the approach described in this guide to formulate your approach.
Data modeling is considered a staple in the world of data management. The skill of the data modeler and their knowledge of the business plays a large role in successful Enterprise Information Management across many organizations. Data modeling requires formal accountability, attention to metadata, and getting the business heavily involved in data requirement development. These are all traits of solid Data Governance programs.
Join Bob Seiner and a special guest modeler extraordinaire in this month’s installment of Real-World Data Governance to discuss data modeling as a form of data governance. Learn how to use the skillfulness of the data modeler to advance data-as-an-asset and governance agendas while conveying the importance and value of both disciplines.
In this webinar Bob and a special guest will talk about:
•Data Modeling as Art or Science
•Role of Data Modeler in a Governance Program
•Data Modeler Skills as Governance Skills
•Modeling and Governance Best Practices
•Leveraging the Model as a Governance Artifact
Master Data Management – Aligning Data, Process, and Governance (DATAVERSITY)
Master Data Management (MDM) provides organizations with an accurate and comprehensive view of their business-critical data such as customers, products, vendors, and more. While mastering these key data areas can be a complex task, the value of doing so can be tremendous – from real-time operational integration to data warehousing and analytic reporting. This webinar will provide practical strategies for gaining value from your MDM initiative, while at the same time assuring a solid architectural and governance foundation that will ensure long-term, enterprise-wide success.
Webinar: The Future of Data Integration - Data Mesh and GoldenGate/Kafka (Jeffrey T. Pollock)
The Future of Data Integration: Data Mesh, and a Special Deep Dive into Stream Processing with GoldenGate, Apache Kafka and Apache Spark. This video is a replay of a Live Webinar hosted on 03/19/2020.
Join us for a timely 45-minute webinar to see our take on the future of Data Integration. As the global industry’s shift towards the “Fourth Industrial Revolution” continues, outmoded styles of centralized batch processing and ETL tooling continue to be replaced by realtime, streaming, microservices, and distributed data architecture patterns.
This webinar will start with a brief look at the macro-trends happening around distributed data management and how that affects Data Integration. Next, we’ll discuss the event-driven integrations provided by GoldenGate Big Data, and continue with a deep-dive into some essential patterns we see when replicating Database change events into Apache Kafka. In this deep-dive we will explain how to effectively deal with issues like Transaction Consistency, Table/Topic Mappings, managing the DB Change Stream, and various Deployment Topologies to consider. Finally, we’ll wrap up with a brief look into how Stream Processing will help to empower modern Data Integration by supplying realtime data transformations, time-series analytics, and embedded Machine Learning from within data pipelines.
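One way to picture the transaction-consistency problem the deep-dive covers: row-level change events arrive interleaved across transactions and should be released to table-mapped topics only in commit order. A simplified sketch (the event format and table-to-topic naming are invented for illustration; GoldenGate’s actual mechanism differs):

```python
# Hypothetical stream of row-level change events captured from a database log.
# Each event carries its transaction id; a commit marker closes the transaction.
events = [
    {"txn": 1, "op": "insert", "table": "orders", "row": {"id": 10}},
    {"txn": 2, "op": "update", "table": "items",  "row": {"id": 7}},
    {"txn": 1, "op": "update", "table": "orders", "row": {"id": 10}},
    {"txn": 1, "op": "commit"},
    {"txn": 2, "op": "commit"},
]

def committed_batches(stream):
    """Buffer events per transaction and release a batch only at commit,
    preserving commit order -- one way to keep downstream Kafka topics
    transactionally consistent."""
    pending = {}
    for ev in stream:
        if ev["op"] == "commit":
            yield pending.pop(ev["txn"], [])
        else:
            pending.setdefault(ev["txn"], []).append(ev)

# Route each committed event to a topic named after its table (a common
# table-to-topic mapping); a real pipeline would publish via a Kafka producer.
for batch in committed_batches(events):
    for ev in batch:
        print(f"db.{ev['table']}", ev["op"], ev["row"])
```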
GoldenGate: https://www.oracle.com/middleware/tec...
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
Data Mesh is a new socio-technical approach to data architecture, first described by Zhamak Dehghani and popularised through a guest blog post on Martin Fowler's site.
Since then, community interest has grown, due to Data Mesh's ability to explain and address the frustrations that many organisations are experiencing as they try to get value from their data. The 2022 publication of Zhamak's book on Data Mesh further provoked conversation, as have the growing number of experience reports from companies that have put Data Mesh into practice.
So what's all the fuss about?
On one hand, Data Mesh is a new approach in the field of big data. On the other hand, Data Mesh is an application of the lessons we have learned from domain-driven design and microservices to a data context.
In this talk, Chris and Pablo will explain how Data Mesh relates to current thinking in software architecture and the historical development of data architecture philosophies. They will outline what benefits Data Mesh brings, what trade-offs it comes with and when organisations should and should not consider adopting it.
Data Catalog for Better Data Discovery and Governance (Denodo)
Watch full webinar here: https://buff.ly/2Vq9FR0
Data catalogs are in vogue, answering critical data governance questions like “Where does my data reside?”, “What other entities are associated with my data?”, “What are the definitions of the data fields?”, and “Who accesses the data?” Data catalogs maintain the necessary business metadata to answer these questions and many more. But that’s not enough: to be useful, data catalogs need to deliver these answers to business users right within the applications they use.
In this session, you will learn:
*How data catalogs enable enterprise-wide data governance regimes
*What key capability requirements should you expect in data catalogs
*How data virtualization combines dynamic data catalogs with delivery
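The governance questions above boil down to searchable business metadata. A toy sketch of a catalog lookup (datasets, systems, and field definitions are made up for illustration):

```python
# Minimal illustration of data catalog entries: business metadata that
# answers "where does my data reside?" and "what does this field mean?".
catalog = [
    {"dataset": "warehouse.dim_customer", "system": "Snowflake",
     "fields": {"customer_id": "Unique customer identifier",
                "churn_flag": "True if no purchase in 12 months"}},
    {"dataset": "crm.accounts", "system": "Salesforce",
     "fields": {"account_id": "CRM account key"}},
]

def find_field(term):
    """Locate every dataset/system holding a field whose name or
    business definition mentions the search term."""
    hits = []
    for entry in catalog:
        for field, definition in entry["fields"].items():
            if term.lower() in field.lower() or term.lower() in definition.lower():
                hits.append((entry["dataset"], entry["system"], field))
    return hits

print(find_field("customer"))
```

A production catalog adds access controls, lineage, and usage statistics on top of this core: metadata indexed for search.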
Gartner: Seven Building Blocks of Master Data Management (Gartner)
Gartner will further examine key trends shaping the future MDM market during the Gartner MDM Summit 2011, 2-3 February in London. More information at www.europe.gartner.com/mdm.
It’s been three years since the General Data Protection Regulation shook up how organizations manage data security and privacy, ushering in a new focus on Data Governance. But what is the state of Data Governance today?
How has it evolved? What’s its role now? Building on prior research, erwin by Quest and ESG have partnered on a new study about what’s driving the practice of Data Governance, program maturity, and current challenges. It also examines the connections to data operations and data protection, which is notable given that improving data security is now the No. 1 driver of Data Governance, according to this year’s survey respondents.
So please join us for this webinar to learn about the:
Other primary drivers for enterprise Data Governance programs
Most common bottlenecks to program maturity and sustainability
Advantages of aligning Data Governance with the other data disciplines
In a post-COVID world, data has the power to be even more transformative, and 84% of business and technology professionals say it represents the best opportunity to develop a competitive advantage during the next 12 to 24 months. Let’s make sure your organization has the intelligence it needs about both data and data systems to empower stakeholders in the front and back office to do what they need to do.
A Work of Zhamak Dehghani
Principal consultant
ThoughtWorks
https://martinfowler.com/articles/data-monolith-to-mesh.html
How to Move Beyond a Monolithic Data Lake to a Distributed Data Mesh
Many enterprises are investing in their next generation data lake, with the hope of democratizing data at scale to provide business insights and ultimately make automated intelligent decisions. Data platforms based on the data lake architecture have common failure modes that lead to unfulfilled promises at scale. To address these failure modes we need to shift from the centralized paradigm of a lake, or its predecessor data warehouse. We need to shift to a paradigm that draws from modern distributed architecture: considering domains as the first class concern, applying platform thinking to create self-serve data infrastructure, and treating data as a product.
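The “data as a product” idea can be sketched as a dataset that ships with an explicit domain, owner, and published contract. A minimal illustration (the class and field names are invented, not from the article):

```python
from dataclasses import dataclass

@dataclass
class DataProduct:
    """Treat data as a product: a domain-owned dataset with an explicit
    owner, a published contract (schema), and a freshness expectation."""
    name: str
    domain: str
    owner: str
    schema: dict          # field name -> expected type: the contract
    freshness_hours: int  # maximum acceptable staleness

    def conforms(self, record: dict) -> bool:
        """Check a record against the published contract."""
        return set(record) == set(self.schema) and all(
            isinstance(record[k], t) for k, t in self.schema.items()
        )

orders = DataProduct(
    name="orders", domain="sales", owner="sales-data-team",
    schema={"order_id": int, "amount": float}, freshness_hours=24,
)

print(orders.conforms({"order_id": 1, "amount": 9.99}))  # conforms to contract
print(orders.conforms({"order_id": "1"}))                # violates contract
```

The point of the pattern is organizational as much as technical: the producing domain, not a central team, owns the contract and is accountable for keeping it true.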
Data Modeling, Data Governance, & Data Quality (DATAVERSITY)
Data Governance is often referred to as the people, processes, and policies around data and information, and these aspects are critical to the success of any data governance implementation. But just as critical is the technical infrastructure that supports the diverse data environments that run the business. Data models can be the critical link between business definitions and rules and the technical data systems that support them. Without the valuable metadata these models provide, data governance often lacks the “teeth” to be applied in operational and reporting systems.
Join Donna Burbank and her guest, Nigel Turner, as they discuss how data models & metadata-driven data governance can be applied in your organization in order to achieve improved data quality.
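The “teeth” that models give governance come from attaching enforceable rules to attribute definitions. A small sketch of model-driven quality checks (the model and rules are illustrative, not from the webinar):

```python
# A data model entry doubling as governance metadata: each attribute
# carries a business rule that operational systems can enforce.
model = {
    "email":  {"required": True,  "rule": lambda v: "@" in v},
    "status": {"required": True,  "rule": lambda v: v in {"active", "closed"}},
    "notes":  {"required": False, "rule": lambda v: True},
}

def quality_issues(record):
    """Return the attributes of `record` that violate the model's rules."""
    issues = []
    for attr, meta in model.items():
        if attr not in record:
            if meta["required"]:
                issues.append(f"{attr}: missing")
        elif not meta["rule"](record[attr]):
            issues.append(f"{attr}: invalid value {record[attr]!r}")
    return issues

print(quality_issues({"email": "a@example.com", "status": "pending"}))
```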
Data Governance Program PowerPoint Presentation Slides (SlideTeam)
Presenting this set of slides, titled Data Governance Program PowerPoint Presentation Slides. This deck of twenty-five slides helps you strategize, plan, analyze, or segment the topic with clear understanding. It includes ready-to-use, editable templates, charts and graphs, overviews, and analysis templates. The slides are available in both widescreen and standard format, are compatible with Google Slides, and can easily be converted to JPG or PDF.
Business models across industries around the world are becoming customer-centric. Recent studies show that "knowing" customers based on internal as well as external data is one of the top priorities of business leaders. Various surveys also reveal that customers do not mind sharing their semi-personal data in exchange for differentiated service. In that context, the 360-degree view of the customer (once thought to be purely a business process, master data management, data integration and data warehouse / business intelligence problem) has now entered the much bigger world of Big Data, including integration with unstructured data sources. The impact of Big Data on customer master data management spans from the integration and linkage of unstructured or semi-structured data with the structured master data maintained within the enterprise, to the analysis and visualization of that data to generate useful insights about customers. Various patterns exist to handle the challenges across the steps of acquiring, linking, managing, analyzing and distributing the enhanced customer data for differentiated products or services.
Enterprise Information Management Strategy - a proven approachSam Thomsett
Access a proven approach to Enterprise Information Management Strategy - providing a framework for Digital Transformation - by a leader in Information Management Consulting - Entity Group
The DMP 101 - Data Management Platforms ExplainedEddy Widerker
Learn more about what a DMP is, how it works, and why it is crucial in today's ad-tech space. Examples on how a DMP could benefit a brand or a publisher are included at the end.
Webinar: Initiating a Customer MDM/Data Governance ProgramDATAVERSITY
Mastering your customer data is on the critical path for any business undertaking the transformation to a data-centric approach. Whether it is to enable effective CRM to enhance day-to-day operations or to leverage in-depth customer analytics for strategic planning, understanding your customer data is the foundation of truly understanding and responding to your customer. The first step in mastering your customer data is to discover and document the existing data landscape.
In this session we will present a case study to uncover the drivers, challenges and benefits of mastering your customer data, and detail how a customer data discovery pilot using erwin modeling can underpin and accelerate this initiative, reduce the associated costs, provide a facility for ongoing analysis and stakeholder awareness, and mitigate the risks involved in re-engineering your customer data management approach.
How to identify the correct Master Data subject areas & tooling for your MDM...Christopher Bradley
1. What are the different Master Data Management (MDM) architectures?
2. How can you identify the correct Master Data subject areas & tooling for your MDM initiative?
3. A reference architecture for MDM.
4. Selection criteria for MDM tooling.
chris.bradley@dmadvisors.co.uk
Magenta advisory: Data Driven Decision Making –Is Your Organization Ready Fo...BearingPoint Finland
It’s nice to have loads of data. Nevertheless, many managers start to sweat when it comes to genuinely fact-based decision making. This study reveals the keys to leveraging big data successfully.
Master Data Management (MDM) for Mid-MarketVivek Mishra
Over the years, MDM has catered to the data requirements of big players and large organizations. But in the last few years, thanks to its benefits, small and medium businesses (SMBs) have been moving towards MDM to organize, categorize, and localize their master data according to their scale of operations and business processes.
Checkout our whitepaper to learn how master data management (MDM) can help you refine your data for optimal data usage. Discover the essential features of a successful MDM software, and explore how MDM can help improve your organization’s data quality, business insight, and more.
Here are some tried-and-true recommendations for creating a proof of concept for master data management that will power your organization through the most common mistakes and challenges.
MDM and Social Big Data: An Impact AnalysisCognizant
By combining social big data with master data management, businesses can develop personalized products and services, anticipate customer needs and gain competitive advantage.
The article is intended as a quick overview of what effective master data management means in today's business context in terms of risks, challenges and opportunities for companies and decision makers. The article is structured in two main areas, which cover in turn the importance of an effective master data management implementation and the methodology to get there.
Big Data for Marketing: When is Big Data the right choice?Swyx
Chief Marketing Officers (CMOs) without plans for Big Data may be putting themselves and
their companies at a competitive disadvantage. Big Data is already being widely deployed to enhance marketing responsibilities, although the small number of widely-touted success stories might be masking a significant number of failed implementations. When correctly planned and implemented, however, Big Data can create significant value for CMOs and their organisations. In this paper, we focus on describing specific examples of how Big Data can support CMO responsibilities and developing frameworks for identifying Big Data opportunities.
Enterprise-Level Preparation for Master Data Management.pdfAmeliaWong21
Master Data Management (MDM) continues to play a foundational role in the Data Management Architecture of every 21st century enterprise. In a forward-looking organization, MDM is significant in the Enterprise Integration Hub.
Marketing & Sales: Big Data, Analytics, and the Future of Marketing & Sales alfredacavx97
Big Data, Analytics, and the Future of Marketing & Sales
March 2015
McKinseyonMarketingandSales.com @McK_MktgSales
Table of contents
Sections: Business Opportunities; Insight and action; How to get organized and get started
8 Getting big impact from big data
16 Big Data & advanced analytics: Success stories from the front lines
20 Use Big Data to find new micromarkets
24 Smart analytics: How marketing drives short-term and long-term growth
30 Putting Big Data and advanced analytics to work
34 Know your customers wherever they are
38 Using marketing analytics to drive superior growth
48 How leading retailers turn insights into profits
56 Five steps to squeeze more ROI from your marketing
60 Using Big Data to make better pricing decisions
60 Marketing’s age of relevance
72 Gilt Groupe: Using Big Data, mobile, and social media to reinvent shopping
76 Under the retail microscope: Seeing your customers for the first time
80 Name your price: The power of Big Data and analytics
84 Getting beyond the buzz: Is your social media working?
90 How to get the most from big data
94 Five Roles You Need on Your Big Data Team
98 Want big data sales programs to work? Get emotional
102 Get started with Big Data: Tie strategy to performance
106 What you need to make Big Data work: The pencil
110 Need for speed: Algorithmic marketing and customer data overload
114 Simplify Big Data – or it’ll be useless for sales
Introduction
Big Data is the biggest game-changing opportunity for marketing and sales since the Internet went mainstream almost 20 years ago. The data big bang has unleashed torrents of terabytes about everything from customer behaviors to weather patterns to demographic consumer shifts in emerging markets.
The companies that are successful in turning data into above-market growth will excel at three things:
- Using analytics to identify valuable business opportunities from the data to drive decisions and improve marketing return on investment (MROI)
- Turning those insights into well-designed products and offers that delight customers
- Delivering those products and offers effectively to the marketplace.
This goldmine of data represents a pivot-point moment for marketing and sales leaders. Companies that inject big data and analytics into their operations show productivity rates and profitability that are 5 to 6 percent higher than those of their peers. That's an advantage no company can afford to ignore.
This compendium explores the business opportunities, company examples, and organizational implications of Big Data and advanced analytics. We hope it provokes good and useful conversations.
Please contact us with your reactions and thoughts.
David Court
Director
David headed McKinsey’s
functional practices, and
currently leads the firm’s digital
in.
The Trusted Path That Driven Big Data to Successankitbhandari32
The four D.A.T.A. questions formulated by Carsten Lund Pedersen & Thomas Ritter for big data are the following: Data, Autonomy, Technology & Accountability.
Master Data Management's Place in the Data Governance Landscape CCG
For many organizations, Master Data Management is a necessity to ensure consistency and accuracy of essential business entities. It further plays alongside data architecture, metadata management, data quality, security & privacy, and program management in the Data Governance ecosystem.
Join CCG's data governance subject matter experts as they overview the fundamentals of Master Data Management at our Atlanta-based Data Analytics Meetup. This event will discuss how to enable components of data governance within your organization and review how to best leverage Microsoft's SQL Server Master Data Services.
Organizational Change Management: A Make or Break Capability for Digital SuccessCognizant
To realize the full benefits of digital transformation programs, businesses must manage the impact of digital change on their operational structure, culture and employees.
MDM It’s not just about a new concept, it’s about bringing real value to the ...Ismail Vurel
Data is at the heart of nearly any business. Being able to rely on the consistency and reliability of data is key to improving performance and reducing costs in numerous areas, from
sales and marketing to manufacturing and the supply chain. In an era where many consider data to be the basic fuel for innovation and growth, many organisations still struggle to use their data to their best advantage.
We conducted a groundbreaking survey of the UK’s data and business professionals to get a snapshot of the state of the world of data, uncover some of the issues facing the industry and get a sense of the changes on the horizon. The results were enlightening, and in some cases, very surprising.
Find out:
Why nearly a third of IT Directors feel their organisation uses data poorly
What the hybrid data manager of the future will look like
Why understanding customer behaviour remains the holy grail for so many
ENTITY WHITE PAPER: WHY MDM PROJECTS FAIL AND WHAT THIS MEANS FOR BIG DATA
INTRODUCTION
There’s no doubt about it – the data universe is expanding at
a dramatic rate. Big data will affect every company, regardless
of size. Big data presents both an enormous challenge and an
enormous opportunity to those companies intent on extracting
value from their information.
According to IDC’s Digital Universe study, the digital universe will double approximately
every two years between 2012 and 2020. This is an intimidating prospect, considering that
80% of all data currently in the digital universe originated in the last two years alone.
Gartner predicts that enterprise data will grow eightfold in five years and that 80% of it will be
unstructured, while structured data continues to grow at a Compound Annual Growth Rate
(CAGR) of 20%.
Furthermore, IDC suggests that only 0.5% of the digital universe is currently analysed;
competitive advantage awaits those companies that succeed in mastering, analysing and
governing their information.
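The growth figures quoted above each imply a compound annual growth rate, and it is worth making the arithmetic explicit. A quick sketch (the input figures are those cited from IDC and Gartner; the conversion formula is standard CAGR arithmetic):

```python
def cagr(multiple: float, years: float) -> float:
    """Compound annual growth rate implied by growing `multiple`-fold over `years`."""
    return multiple ** (1 / years) - 1

# IDC: the digital universe doubles roughly every two years.
print(f"Doubling every 2 years   ~= {cagr(2, 2):.0%} per year")   # ~41% per year

# Gartner: enterprise data grows eightfold in five years.
print(f"8-fold growth in 5 years ~= {cagr(8, 5):.0%} per year")   # ~52% per year

# Structured data alone, at 20% CAGR, over the same five years:
print(f"20% CAGR over 5 years    ~= {1.20 ** 5:.1f}x growth")     # ~2.5x
```

In other words, even the "slow-growing" structured data more than doubles over the period, while the overall digital universe grows far faster.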
The convergence of several key industry factors is influencing the origination of this data:
the cost of information storage is reducing; mass market adoption of mobile technologies
(smartphones, tablets) means their users are generating lots of unstructured data; machine
generated data is on the rise; cloud adoption is increasing for both business and personal
use; and virtualisation is becoming commonplace within IT architectures.
If organisations are intent on extracting significant value from their data, then they must
first build the foundations for treating data as an enterprise asset.
Big data initiatives run the risk of failure because the foundations of information
management including a consistent enterprise reference data architecture, reference data
management, master data management (MDM) and information lifecycle management
are not in place. In each case organisations are attempting to gain insight and value from
information; Big Data is a larger, scarier version of the same problem.
In light of the fact that 80% of the world’s data was created in the last two years, it is
reasonable to ask whether organisations have progressed dramatically in managing data
in this time, whether they are gaining significant insight from their own internal enterprise
data, and whether they are ready for exponentially increasing volumes of data. Bluntly, in
each case, the answer is no.
Organisations are, however, starting to put their houses in order in preparation for Big Data.
The reasons are clear - if an organisation can truly learn to govern its data across the
enterprise, if it can master information, gain insight and distribute that insight back across
the enterprise to create value, then its people, processes and technology will be better
placed to derive significant value and competitive advantage from Big Data. If it cannot,
it will not.
Data governance, information management strategy, master data management, reference
data management and information lifecycle management, therefore take on greater
importance in preparing the enterprise for Big Data.
Given the potential benefits of getting information management projects right, it is
surprising that only 24 percent of the 192 large organisations surveyed about data
quality in 2011 by analyst firm The Information Difference described their MDM projects as
“successful or better.” Evidently, a number of MDM programmes are failing to deliver
expected outcomes.
These statistics lead us to ask why MDM projects fail, and what organisations can learn
from them for the Big Data challenges ahead. The probability of failure of
MDM projects increases because of a number of factors:
ENTERPRISE THINKING
By its very nature, an MDM initiative requires integration of the information from
different divisions, departments and systems across the enterprise. This involves each of
those divisional and department heads and the system owners subscribing to a single
corporate vision. In many organisations, the MDM initiative is the very first time that the
entire enterprise has to act together to achieve a common goal. It is often very difficult
for this group of people, each with their own parochial interests at heart, to agree on a
common objective and the roadmap to the wealth of benefits that can be achieved.
The realities of business mean that quite often data is defined at the business unit level, in
separate businesses prior to a merger, or at product level. This results in siloed information
strategies, siloed solutions and siloed data. While it is true that nobody starts from a green
field when looking at their data from an enterprise perspective, an effort must be made
when defining an MDM strategy to understand the viewpoints and needs of all of the key
stakeholders of business systems. Business owners will have their own projects, their own
resources and their own budgets that will colour their perspective.
In TDWI’s report on Next Generation MDM, 25% of 219 respondents had more than 10
definitions of customer (while a further 15% didn’t know) and 26% had more than 10
definitions of product (and a further 17% didn’t know). Our own experience working
with multiple global enterprise MDM initiatives more than bears witness to these findings.
The examples above raise the question of whether organisations perceive the customer as a
customer of a department or of the whole enterprise. This underlines the need to change
the mindset of the organisation: to start thinking and operating at an enterprise level, to
bring data together at an enterprise level, and to start seeing the customer (and customer
data) as an enterprise asset.
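One practical consequence of bringing data together at enterprise level is reconciling the conflicting departmental records that those ten-plus definitions of "customer" produce. A common MDM technique is to merge them into a single "golden record" using survivorship rules. The sketch below uses one simple rule (most recently updated non-empty value wins); the source systems, field names and the rule itself are illustrative, not drawn from the white paper:

```python
from datetime import date

# Illustrative departmental records for the same customer, each holding a
# partial, possibly stale view, with its own last-updated date.
records = [
    {"source": "CRM",     "updated": date(2013, 5, 1),
     "name": "ACME Ltd",     "email": "sales@acme.example", "phone": ""},
    {"source": "Billing", "updated": date(2013, 9, 12),
     "name": "ACME Limited", "email": "",                   "phone": "+44 20 7946 0000"},
    {"source": "Support", "updated": date(2012, 11, 3),
     "name": "Acme",         "email": "help@acme.example",  "phone": ""},
]

def golden_record(records, fields):
    """Survivorship: for each field, take the value from the most recently
    updated source that actually has a non-empty value for that field."""
    merged = {}
    for field in fields:
        candidates = [r for r in records if r.get(field)]
        candidates.sort(key=lambda r: r["updated"], reverse=True)
        merged[field] = candidates[0][field] if candidates else None
    return merged

print(golden_record(records, ["name", "email", "phone"]))
```

Real MDM platforms apply far richer rules (source trust scores, per-attribute precedence, manual stewardship), but the principle of attribute-level survivorship is the same.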
EXECUTIVE SPONSORSHIP
Associated with the need for enterprise thinking is the need for effective executive
sponsorship. Somebody at the top of the organisation must own and care deeply
about the MDM initiative, and expect a significant return on investment through the
implementation of an enterprise solution. Again, our experience bears out this assertion.
In order for MDM programmes to be successful they require cross-departmental thinking
and organisational change, and therefore need C-level buy-in and leadership. Without the
backing of senior management to make changes across the organisation and to start the
process of thinking at enterprise level, these projects will fail.
BUSINESS CASE
As with any major business change initiative, a business case or compelling business driver
is essential for an MDM project to be successful. According to a 2010 survey by Information
Difference, only 60% of projects were progressed at that time with a robust business case.
Ultimately, all projects within an organisation are competing for resources and those whose
benefits are clearly understood stand more chance of progressing. Furthermore, those
projects without a business case are more likely to be cancelled or to be categorised as
failures, simply because quantifiable business outcomes were not defined for the project at
the outset. The probability of re-prioritisation of projects increases as organisations operate
through the current economic downturn.
Defining the business case for an MDM initiative is especially important as MDM tends to be
an enabler to future value rather than delivering direct business value itself.
The business case for MDM can be expressed in many ways, including customer satisfaction,
cross-sell, up-sell, operational efficiency, improvements to strategic decision making,
regulatory compliance, data quality and governance. Whichever of these benefits you
ascribe to your MDM initiative, it is important to understand, document, agree and
continually measure the value each benefit delivers, to which areas of the business,
and when that value will be realised.
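The "document, agree and continually measure" discipline amounts to keeping a benefits register: each expected benefit with an owning business area, a measurable baseline and target, and a delivery date. A minimal sketch of such a register (the structure and the example figures are hypothetical, purely to illustrate the idea):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Benefit:
    """One line of an MDM business-case benefits register."""
    name: str
    business_area: str
    baseline: float      # measured value before the programme
    target: float        # agreed value the programme should deliver
    measure_by: date     # when the value is due to be realised

    def achieved(self, current: float) -> float:
        """Fraction of the baseline-to-target improvement realised so far."""
        return (current - self.baseline) / (self.target - self.baseline)

# Hypothetical benefit: lift the cross-sell rate from 8% to 12% of customers.
cross_sell = Benefit("Cross-sell rate", "Sales", baseline=0.08,
                     target=0.12, measure_by=date(2014, 6, 30))

print(f"{cross_sell.achieved(0.10):.0%} of the way to target")
```

The point is not the code but the discipline: every claimed benefit becomes something that can be agreed at the outset and re-measured throughout the programme, rather than asserted once and forgotten.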
MDM AS AN INFRASTRUCTURE SOLUTION RATHER
THAN A BUSINESS SOLUTION
This consideration is aligned with that of the business case above. An enterprise MDM
solution is an essential component of a well-worked information management architecture,
giving an IT organisation the flexibility and scalability to support changing business
priorities into the future. This is a good thing, and often leads to comments from senior
executives like ‘the case for MDM is a given’. In this scenario, the implementation of MDM is
driven from an IT perspective rather than from a business one. Whilst it is undoubtedly true
that MDM forms a cornerstone of an effective information management architecture, the
complexity of enterprise thinking and the need for business change to support it mean that
it must be driven from a business rather than an IT perspective.
Often, large companies attempt to implement multi-domain master data management in a single programme. They may use the same technology platform (e.g. IBM InfoSphere MDM or Informatica MDM) to master a number of business-critical data entities across departments, business units or functions. The technology chosen,
however, does not answer the reasons “WHY” the organisation is embarking on an MDM
initiative. The “WHY” is the business outcome that is expected from the programme. MDM
programmes should align to business objectives - the technology / infrastructure solution is
simply “HOW” you get there.
As long as an organisation allows technology to shape business decisions, rather than the other way around, the strategic goals and business benefits hoped for from the MDM initiative will never be reached.
ROADMAP
Too often, organisations attempt a "big bang" approach to mastering numerous data domains across the organisation. They attempt to integrate multiple silos without really considering what data should be within the scope of the programme and when.
A properly defined information management strategy will identify an organisation’s optimal
roadmap for deriving the most business benefit, in the shortest timeframe, from its information
management projects. Quite rightly, in today’s economic climate, time to business value should
be a critical factor in prioritising each project. However, it is important that each initiative is
implemented within the constraints of an enterprise information strategy and reference data
architecture.
It is not uncommon for organisations to see the need to master customers, vendors and prospects at different times and in different ways, and therefore to treat them as distinct projects and deliverables, only to discover that an important part of the business case is to identify which customers are also prospects and vendors. If the overall roadmap and business case were understood, then customer, vendor and prospect could be mastered as a single domain, 'Party', still potentially implemented as separate projects but deriving increased value as each is delivered over time.
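To make the idea concrete, the unified 'Party' domain can be sketched as a single master record carrying one or more roles. This is only an illustration: the class, identifiers and role names below are assumptions, not taken from any particular MDM product.

```python
from dataclasses import dataclass, field

@dataclass
class Party:
    """One mastered 'Party' record; roles record how the organisation
    relates to it (customer, vendor, prospect, ...)."""
    party_id: str
    name: str
    roles: set = field(default_factory=set)

    def add_role(self, role: str) -> None:
        self.roles.add(role)

# The same real-world entity, mastered once, playing several roles.
acme = Party(party_id="P-001", name="Acme Ltd")
acme.add_role("customer")
acme.add_role("vendor")

# The cross-domain question from the business case ("which customers
# are also vendors?") becomes a simple query over one domain:
is_customer_and_vendor = {"customer", "vendor"} <= acme.roles
```

Mastering customer, vendor and prospect as three separate silos would instead require cross-referencing three sets of identifiers to answer the same question.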
Another important consideration is where to start. Don't start your MDM initiative with a simple domain that gives limited business value. It is a common mistake to begin with something technically simple, with a clear scope and limited impact. It is important that the first project delivers real value that can be heralded as a huge success across the organisation, and that it proves the entire concept from a technology and infrastructure perspective.
COMMUNICATIONS PLANNING
While MDM enables joined-up data, and therefore joined-up thinking, across the organisation, this is only possible if the people working on the project communicate to make it happen. Often, MDM projects are implemented across functions, product lines and business units, and key stakeholders will often understand only their own information requirements rather than cross-enterprise ones. This inevitably creates blockers to the success of the project unless an effective communications plan is put in place to address their concerns.
An effective communications plan must communicate the progress and successes of the
initiative, with all successes against the business case measured and quantified; successful
information management projects are more likely to gain widespread adoption across the
enterprise if people know about them.
BUSINESS CHANGE PROGRAMME
Master Data Management programmes cause change: to data, to systems, to business processes, to people and to the enterprise. An organisation should map itself out to identify the data, systems, processes and people affected by the initiative, and how they will be affected.
This mapping should ask questions not only of existing systems, roles and departments but also of future ones. For example: should data governance be centralised? Who owns the mastered data post-implementation, the department or the enterprise? How does this change existing processes? Where does the data stewardship role, which didn't exist previously, fit; is it now a central, enterprise role? What changes need to be made now to existing systems to manage changes to master data? How does this affect users?
If your organisation is not mapped out and these questions are not asked, normal business operations will be disrupted and the MDM initiative will be dropped at the first sign of resistance to change. Andrew White, Research Vice President at Gartner, identifies organisational change as one of the primary barriers to MDM adoption.
PRODUCT SELECTION / UNIQUE SKILLS
Information management is often misunderstood: it is neither a purely technical exercise nor a purely business exercise; it is both, and as such it requires a unique set of skills for effective planning, product selection and implementation.
According to TDWI's Next Generation MDM report, 26% of organisations surveyed had attempted a "homegrown" MDM solution, while only 2% preferred that option over dedicated MDM tools. Often such homegrown solutions were proofs of concept that now require scaling across the organisation. MDM solutions, however, have matured far beyond this into a comprehensive mix of data model, workflow, integration, authoring, stewardship, matching, linking and survivorship. It is questionable whether a homegrown solution could meet all of these objectives effectively. Given their sizeable investments in R&D, made possible only because the solutions can be deployed with multiple customers, only enterprise-scale commercial solutions are likely to be effective long-term.
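As a toy illustration of two of the capabilities listed above, matching and survivorship, the sketch below uses a crude string similarity and a "latest non-empty value wins" rule. Real MDM engines combine many attributes, phonetic keys and configurable rule sets; the field names and threshold here are purely hypothetical.

```python
from difflib import SequenceMatcher

def match_score(a: dict, b: dict) -> float:
    """Crude fuzzy match on the name attribute alone."""
    return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()

def survive(records: list) -> dict:
    """Toy survivorship rule: for each attribute, the most recently
    updated non-empty value wins in the golden record."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for key, value in rec.items():
            if value:  # later, non-empty values overwrite earlier ones
                golden[key] = value
    return golden

r1 = {"name": "Acme Limited", "phone": "", "updated": "2012-01-01"}
r2 = {"name": "ACME Ltd", "phone": "01234 567890", "updated": "2012-06-01"}

if match_score(r1, r2) > 0.6:      # treat as duplicates of one entity
    golden = survive([r1, r2])     # link and merge into a golden record
```

Even this tiny example shows why hand-coded silos struggle: every new source system multiplies the matching and survivorship rules that must be maintained.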
While these organisations were able to hand-code an MDM silo, a number of them will find
that they are unable to implement, govern and maintain MDM across the enterprise. Unless
you have the right people in place with the required blend of technical skills and business
understanding, your chances of successfully implementing your Master Data Management
strategy across the enterprise are negligible.
Understanding why MDM projects fail will help to mitigate these risks. The steps below
offer a practical approach for addressing these problems and for implementing MDM
successfully across a complex organisation.
INFORMATION MANAGEMENT / DATA GOVERNANCE STRATEGY
The purpose of the Information Management Strategy is to define an Information
Architecture and strategy that meets the needs of your business as it changes over time.
Once the strategy is understood and agreed, an optimal roadmap is identified for deriving the most business benefit from your information management projects as they are implemented incrementally. The objective is to quickly provide recommendations on areas where improvements could be made, based on strategic goals and drivers.
Master Data Management is an essential component of the wider enterprise information
management strategy. MDM is pivotal within an information architecture as it supplies and
maintains master data across enterprise systems.
Of course, any information management projects within your information management
strategy must each be supported by a compelling business case for implementation.
ENTERPRISE INFORMATION REFERENCE ARCHITECTURE
A successful Information Management solution architecture must enable master data to
be managed consistently across all people, processes and systems within the enterprise.
However, it involves far more than just implementing a central repository of data. The architecture and design approach should be based upon a well-defined set of configurable components. These include:
An enterprise data model which standardises a consistent model of both reference
data and master data. It should provide a business glossary and consider both the
operational and analytical requirements of the enterprise.
Information Lifecycle Management and Data Quality components that allow new master data and reference data to be created, collaborated on, managed and retired by the enterprise in a consistent manner.
Data Stewardship components that allow data quality issues to be managed and Identity
Analytics components to detect potential duplicates within the data.
Data profiling components that measure and monitor data quality against objective
targets set by the data governance board.
Analytical components such as data warehouses to provide enterprise level query based
reporting and event based analytics to provide real time operational intelligence.
Content Management components to manage unstructured data and to cross link it to
standardised reference data and master data.
Security and Audit components to ensure that master data can only be accessed by those
systems and people that are authorised to do so.
Integration and connectivity components to enable information to flow easily and quickly to the processes and systems within the enterprise that need it.
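The data profiling component described above can be sketched as simple completeness and validity measures compared against a governance target. The postcode pattern and the 90% target below are illustrative assumptions, not figures from any real governance board.

```python
import re

def profile(records: list, attr: str, pattern: str) -> dict:
    """Measure completeness (non-empty) and validity (matches the
    expected pattern) for one attribute across a set of records."""
    total = len(records)
    filled = [r[attr] for r in records if r.get(attr)]
    valid = [v for v in filled if re.fullmatch(pattern, v)]
    return {
        "completeness": len(filled) / total,
        "validity": len(valid) / total,
    }

customers = [
    {"postcode": "ME9 8PX"},
    {"postcode": "not known"},
    {"postcode": ""},
]
# Hypothetical target set by a data governance board: 90% valid postcodes.
metrics = profile(customers, "postcode", r"[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}")
meets_target = metrics["validity"] >= 0.90
```

In practice such measures would be run continuously and reported to the data governance board, not computed ad hoc.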
A number of relevant enterprise reference architecture patterns exist, such as IaaS (Information as a Service) and SOA (Service Oriented Architecture). These two examples
promote best practice integration principles such as consistent service reuse, flexibility and
loose coupling between systems. They lower the cost of system integration and provide a
platform for growth and change without requiring a restructure of the organisation and
its systems. Other important architectural considerations include providing highly available
services, rapid performance and the ability to scale the architectural components to support
the Big Data volumes of the future.
The enterprise architecture in many organisations has typically suffered from having to
respond to pressures of growth, business and technology change. MDM and associated
information management principles provide a unique opportunity to put a reference
enterprise architectural vision in place and to begin incrementally reducing the amount of
redundant information and systems within the business.
PROJECT PRIORITISATION AND ROADMAP
A ‘heat map’ process provides an objective mechanism to identify the information pains
within an organisation and then to prioritise solution delivery within the constraints of
effective information management strategy. It is an effective mechanism to derive and
manage a programme roadmap over a period of time.
This heat map enables executive level management to visualise the information maturity of
their data entities across the organisation. It will highlight which information management
projects should be tackled first and enables the organisation to create the optimum
roadmap for tackling projects incrementally with a view to deriving maximum business
benefit.
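A minimal sketch of such a heat map: each data domain is scored for the business pain it causes today and the value of fixing it, and a blended score orders the roadmap. The domains, scores and weighting below are hypothetical, chosen only to show the mechanism.

```python
# Hypothetical scores (1 = low, 5 = high) per data domain.
domains = {
    "Customer": {"pain": 5, "value": 5},
    "Product":  {"pain": 3, "value": 4},
    "Supplier": {"pain": 2, "value": 2},
}

def heat(scores: dict, pain_weight: float = 0.5) -> float:
    """Blend current pain and potential value into one heat score."""
    return pain_weight * scores["pain"] + (1 - pain_weight) * scores["value"]

# Highest heat first: the order in which to tackle MDM projects.
roadmap = sorted(domains, key=lambda d: heat(domains[d]), reverse=True)
```

The value of the exercise lies less in the arithmetic than in forcing stakeholders to agree the scores, which surfaces the cross-enterprise priorities.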
When considering master data initiatives, it is inevitable that mastered solutions for individual data domains (Customer, Supplier, Product, Part, Location, etc.) will have different relative priorities for different organisations. Prioritising their development and delivery in the context of a wider information management strategy, taking into account the practical considerations of resourcing service delivery, is not straightforward, but it leads to effective planning and management and therefore minimises the costs and timescales of solution delivery.
INCREMENTAL BENEFITS
The roadmap should allow for manageable scope within specific areas of the business
(e.g. the Customer master) rather than attempting everything at once. The old quip is apt here: how do you eat an elephant? One bite at a time. This controlled focus should enable business benefits to be realised more quickly, and lessons to be learned by the organisation as it progresses incrementally along the roadmap.
This approach lays the foundations for information management project delivery. It allows a business case to be made for each stage of the plan; when each stage succeeds against measurable, quantifiable benefits, organisational change is more widely accepted and trusted. This feeds the appetite for, and therefore speeds the adoption of, enterprise information sharing initiatives such as MDM, as long as these quantified successes are communicated across the organisation. Approaching your information management strategy in this "agile" way vastly increases the probability of success versus a more traditional "big bang" approach.
EFFECTIVE SPONSORSHIP
Effective sponsorship at the right level in the business increases the probability of MDM
project success. Executive level sponsors are more likely to fund projects that align with
the strategic objectives for the organisation. The likelihood of effective sponsorship
therefore increases when master data management projects help the organisation to
meet strategic goals. This point may seem to be a statement of the patently obvious,
but it is remarkable how many information management and MDM projects commence
without being effectively tied to business objectives and success.
Effective sponsorship, however, requires a lot more than being an advocate for an
information management programme and securing its funding. Effective sponsorship
requires that you lead with a vision for business change, that the project is funded,
and that you make those responsible for implementation accountable for realising the
business benefits outlined in the business case.
COMMUNICATIONS PLANNING
Your roadmap will be designed to meet both business and data requirements from key
stakeholders throughout the organisation. This will also create the structure of your
communications plan informing key stakeholders how their business processes will be
affected prior to, during and post implementation of information management projects.
Regular status updates should inform key stakeholders of the progress of information management projects along the roadmap and of which benefits, both tangible and intangible, have been realised; all benefits should be evangelised to C-level to help speed Information Management adoption across the enterprise.
MEASUREMENT AND METRICS
The metrics used to measure the success of your information management projects
should be linked to the business drivers outlined in the business case(s) for the
programme.
An important strategic goal, for example, might be increased revenue from cross-selling
and upselling. Key related metrics are therefore the improvements in customer data
quality and customer data integrity over time. Other strategic goals might include
staff efficiencies, for example a reduction in manual data entry; a key metric would therefore be the number of staff hours spent creating management reports. Key data metrics should be included in management reports to the business leaders whose strategic goals they affect, so that those leaders remain engaged with ongoing data governance.
It is critically important to understand the metrics that report the efficiency of particular
business processes and to measure them before, during and after the implementation of
any master data or information management initiative. This is a key component of any
data governance programme.
Lastly, compliance with data policies, rules and standards (across business units) should also be measured periodically to help focus the organisation on effective data governance after information projects are implemented.
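As a small illustration of before/during/after measurement, the sketch below tracks a duplicate-rate metric for customer records against an assumed governance target. All of the numbers are hypothetical; the point is that the same metric is captured at each phase so that improvement can be quantified.

```python
# Hypothetical duplicate-rate measurements for customer records, taken
# before, during and after an MDM implementation (lower is better).
duplicate_rate = {"before": 0.18, "during": 0.09, "after": 0.02}

def improvement(before: float, after: float) -> float:
    """Relative improvement in a 'lower is better' metric."""
    return (before - after) / before

gain = improvement(duplicate_rate["before"], duplicate_rate["after"])
on_track = duplicate_rate["after"] <= 0.05  # assumed governance target
```

Reporting a figure like this alongside the business case (e.g. duplicate customers directly suppress cross-sell revenue) is what keeps sponsors engaged.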
PROJECT AND PROGRAMME GOVERNANCE
A key determinant of MDM success is the success of the implementation projects themselves, and this can only be achieved with effective project governance. A properly governed information management project should ideally contain the following elements:
A compelling, documented business case.
Agreed and documented business-level requirements.
An unambiguous specification of project deliverables, agreed by all stakeholders.
A clearly documented project scope.
A process for measuring whether the completed project meets its original objectives.
Appropriate project sponsorship that is in place and exercised effectively.
An effective project steering process.
A documented understanding of the relationships between all internal and external groups involved in the project.
Identified and engaged stakeholders who are communicated with effectively at appropriate intervals.
Effective project management processes.
Appropriate status and progress reporting mechanisms.
Review checkpoints and processes to confirm that the project continues to meet its business, commercial and time goals.
Project documentation recorded effectively and held in a central, accessible location.
Processes for the effective management of project risks, issues and changes.
Processes for reviewing the quality of key project deliverables and of project governance procedures.
Processes for conflict management.
A project governance approach such as this enables effective management of
information management projects and is repeatable as initiatives are progressed along
the roadmap.
BUSINESS CHANGE PROGRAMME
The roadmap defined during your information management strategy work will have identified the areas of the business that will be affected by the MDM programme. If you understand how MDM will impact business units, systems, processes and people, you will be able to define a business change programme that ensures the programme's success. The complexity of this activity should not, however, be underestimated.
Furthermore, there will be organisational change to cope with as a consequence of information management initiatives. For example, a single view of customer might identify an unexploited market opportunity that requires a new sales structure to capitalise on it, which in turn might require the creation of new master data attributes.
SUMMARY
The ability to exploit the information within an organisation as an asset of the entire
enterprise is arguably the defining feature of the successful business of the future.
An effective information management strategy, of which master data management is
an essential component, is foundational to meeting the coming challenges of Big Data.
Competitive advantage from complex analytics and from Big Data is achieved by building on a consistent information platform for the entire enterprise. This in turn can only be implemented through a structured information management strategy and reference architecture.
For any enterprise, large or small, getting from where it is now to this state of data nirvana can sound like a task too complex to undertake. It is not. Through strategic planning, a structured approach to information management strategy, sponsorship at the right level, prioritisation of delivery against incremental and measurable business cases, understanding and managing business change, strong management and constant communication, this elephant can be eaten and even enjoyed.
ABOUT ENTITY GROUP
Entity Group is an information management solutions specialist. Entity provides
independent consultancy, software solutions and services that exploit the value of
information and deliver competitive advantage to large scale clients across the information
management lifecycle; its services range from an information management strategic review,
through to analysis and implementation services for Big Data, data modelling, information
integration, master data management and analytics.
Entity is committed to long term collaboration with our clients and partners, most of whom
continue to work with us over many years and multiple projects. In addition to working
directly with end-user organisations, Entity’s bespoke data management and domain
expertise often sees the company called in to solve unusual or highly-challenging business
data issues on behalf of global IT services companies and software vendors.
REFERENCES
IDC: The Digital Universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East
http://www.emc.com/leadership/digital-universe/index.htm
Data Quality, Governance Critical to MDM Success, Loraine Lawson
http://www.itbusinessedge.com/cm/blogs/lawson/data-quality-governance-critical-to-mdm-success/?cs=47414
Next Generation Master Data Management, TDWI
http://tdwi.org/research/2012/04/best-practices-report-q2-next-generation-master-data-management.aspx
Building a Robust Business Case for High Quality Master Data, Information Difference Whitepaper, Andy Hayler, February 2010
http://www.melissadata.com/enews/business-case-for-mdm.pdf
Gartner Says Master Data Management Is Critical to Achieving Effective Information Governance, January 19th 2012
http://www.gartner.com/newsroom/id/1898914
Entity House
980 Cornforth Drive
Kent Science Park
Sittingbourne
KENT ME9 8PX
United Kingdom
www.entity.co.uk
For more information please contact:
James Wilkinson
Chief Executive Officer, Entity Group
james.wilkinson@entity.co.uk
www.entity.co.uk