A Fitzgerald Analytics client case study about the practical, high-value first steps taken by the head of technology at Bridgestone Firestone's consumer credit company, which provides private label credit cards used by millions of customers in 5,000 retail stores selling tires and automobile repair services. The presentation describes the technology and data management gap that existed, the steps taken to close it, and the benefits achieved in quality, efficiency, and ROI on data analytics from a major data warehousing and data integration effort.
Webinar: Decoding the Mystery - How to Know if You Need a Data Catalog, a Dat...DATAVERSITY
This document discusses the importance of metadata and data governance. It describes how a data catalog can consolidate metadata from various sources like a business glossary, data dictionary, and data profiling. Automating data lineage is key to harvesting metadata at scale and establishing relationships between different metadata objects. When integrated in a data catalog, metadata provides a single source of truth about an organization's data that improves data literacy and trust.
This document provides an overview of data modeling concepts. It discusses the importance of data modeling, the basic building blocks of data models including entities, attributes, and relationships. It also covers different types of data models such as conceptual, logical, and physical models. The document discusses relational and non-relational data models as well as emerging models like object-oriented, XML, and big data models. Business rules and their role in database design are also summarized.
This framework helps organizations align Data Strategy with Business Strategy to prioritize goals around the most pressing operational needs. It introduces a Data Management & Data Ability Maturity Matrix that visualizes the core path of business digital transformation in a way that is easy to understand and follow, and it provides a standard implementation template flexible enough to apply across different industries.
Because every organization produces and propagates data as part of its day-to-day operations, data trends loom ever larger in the mainstream business world’s consciousness. For many organizations across industries, though, comprehension of this development begins and ends with buzzwords such as “big data,” “NoSQL,” and “data scientist.” Few realize that any solution to their business problems, regardless of platform or technology, relies to a critical extent on the data model supporting it. As such, Data Modeling is not an optional task in an organization’s data effort, but a vital activity that facilitates the solutions driving your business. Since quality engineering and architecture work products do not happen accidentally, the more your organization depends on automation, the more important the data models driving those engineering and architecture activities become.
How to Use a Semantic Layer to Deliver Actionable Insights at ScaleDATAVERSITY
Learn about using a semantic layer to enable actionable insights for everyone and streamline data and analytics access throughout your organization. This session will offer practical advice based on a decade of experience making semantic layers work for Enterprise customers.
Attend this session to learn about:
- Delivering critical business data to users faster than ever at scale using a semantic layer
- Enabling data teams to model and deliver a semantic layer on data in the cloud
- Maintaining a single source of governed metrics and business data
- Achieving speed-of-thought query performance and consistent KPIs across any BI/AI tool, including Excel, Power BI, Tableau, Looker, DataRobot, Databricks, and more
- Providing dimensional analysis capability that accelerates performance with no need to extract data from the cloud data warehouse
Who should attend this session?
Data & Analytics leaders and practitioners (e.g., Chief Data Officers, data scientists, data literacy, business intelligence, and analytics professionals).
The document discusses different approaches to data resource management. It describes traditional file processing, where data is organized across independent files, leading to issues like data redundancy and lack of integration. The modern approach is database management, which consolidates organizational data into centralized databases managed by a database management system (DBMS). The DBMS allows many applications to access integrated data and maintains data quality. The chapter also covers logical and physical database design, different database structures, and types of databases like operational, distributed, external, and data warehouses.
This document discusses how to create a data governance dashboard by connecting it to Trillium Software's data quality platform. It recommends including business rule metadata, the rules library, decision points, and time series analysis in the dashboard. It demonstrates how to use the OLE DB provider to abstract the platform's architecture and define tables to retrieve metrics, rules results, metadata, and more. Connecting the dashboard to the repository in this way allows efficient ongoing monitoring of data quality.
Keys to Creating an Analytics-Driven CultureDATAVERSITY
Changing company culture takes time, energy, and focus, as well as consistent reinforcement long after the breakroom’s company culture posters start to fade. Creating an analytics-driven culture may be even harder to grow and sustain. Yet the rewards are vast for companies whose culture embodies an analytics-first mindset, and for those who use the derived insights to improve operational efficiency and decision-making, generate new revenue, and prevent risk and fraud.
This webinar will offer advice and real-world examples on how to:
Develop and utilize an analytics-focused vision statement
Engage senior leaders to support analytics as a business problem-solver
Communicate best practices to engage participants in the culture change
Use tried-and-tested best practices and approaches to build an analytics-driven culture
This document discusses Gemini, a self-service business intelligence (BI) solution that aims to unite business users and IT. It does this by providing an easy to use and familiar tool (Excel) for business users to access and analyze data in a managed and compliant way. The solution leverages existing technologies like SQL Server, SharePoint and Excel Services to provide capabilities like data feeds, reports as data sources and collaboration in a secure and scalable manner while giving IT management and oversight of resources and data. The goal is to empower business users with self-service capabilities while also addressing IT needs for compliance, reliability and maintenance.
DAS Slides: Data Quality Best PracticesDATAVERSITY
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
This document discusses conceptual data modeling and Entity-Relationship diagrams. It defines key terms like entities, attributes, relationships and cardinality. It explains how to represent these concepts in ER diagrams and discusses best practices for naming relationships and defining domains. The goals of conceptual data modeling are to accurately represent organizational data and rules through diagrams and establish consistency between the data, process and logic models.
This document discusses enterprise data management. It defines enterprise data management as removing organizational data issues by defining accurate, consistent, and transparent data that can be created, integrated, disseminated, and managed across enterprise applications in a timely manner. It also discusses the need for a structured data delivery strategy from producers to consumers. The document then outlines some key enterprise data categories and provides a conceptual and logical view of an enterprise master data lineage architecture with data flowing between transactional systems, a data management layer, and analytics.
The document discusses data governance and why it is an imperative activity. It provides a historical perspective on data governance, noting that as data became more complex and valuable, the need for formal governance increased. The document outlines some key concepts for a successful data governance program, including having clearly defined policies covering data assets and processes, and establishing a strong culture that values data. It argues that proper data governance is now critical to business success in the same way as other core functions like finance.
This document discusses different types of data models, including hierarchical, network, relational, and object-oriented models. It focuses on explaining the relational model. The relational model organizes data into tables with rows and columns and handles relationships using keys. It allows for simple and symmetric data retrieval and integrity through mechanisms like normalization. The relational model is well-suited for the database assignment scenario because it supports linking data across multiple tables using primary and foreign keys, and provides query capabilities through SQL.
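The relational model described above can be sketched in a few lines. This is a minimal, hypothetical example (the `customer` and `orders` tables and their columns are invented for illustration) showing a primary/foreign key pair and a SQL join, using Python's built-in `sqlite3` module:

```python
import sqlite3

# Two tables linked by a primary/foreign key pair, queried with a join.
# Table and column names are invented for the example.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        amount      REAL NOT NULL
    )""")
conn.execute("INSERT INTO customer VALUES (1, 'Acme Corp')")
conn.execute("INSERT INTO orders VALUES (10, 1, 250.0)")

# The foreign key lets us link rows across tables in a single query.
rows = conn.execute("""
    SELECT c.name, o.amount
    FROM orders o JOIN customer c ON c.customer_id = o.customer_id
""").fetchall()
print(rows)  # [('Acme Corp', 250.0)]
```

The `REFERENCES` clause is the integrity mechanism the summary mentions: with foreign keys enabled, the database rejects an order pointing at a nonexistent customer.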
OLAP (online analytical processing) allows users to easily extract and view data from different perspectives. It was invented by Edgar Codd in the 1980s and uses multidimensional data structures called cubes to store and analyze data. OLAP utilizes either a multidimensional (MOLAP), relational (ROLAP), or hybrid (HOLAP) approach to store cube data in databases and provide interactive analysis of data.
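The cube idea can be illustrated without any OLAP engine. The sketch below, with invented sales data, shows the core operation: facts keyed by two dimensions (region, quarter), rolled up along either axis:

```python
from collections import defaultdict

# Facts keyed by two dimensions (region, quarter) plus a measure.
# The data is invented for illustration.
facts = [
    ("East", "Q1", 100), ("East", "Q2", 150),
    ("West", "Q1", 80),  ("West", "Q2", 120),
]

def rollup(facts, axis):
    """Aggregate the measure along one dimension (0=region, 1=quarter)."""
    totals = defaultdict(int)
    for row in facts:
        totals[row[axis]] += row[2]
    return dict(totals)

print(rollup(facts, 0))  # {'East': 250, 'West': 200}
print(rollup(facts, 1))  # {'Q1': 180, 'Q2': 270}
```

A MOLAP engine precomputes and stores such aggregates in the cube, while a ROLAP engine derives them from relational tables at query time, as here.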
This document provides an overview of telecommunications and networking concepts. It defines key terms like networks, Metcalfe's Law, middleware, digital networks, wireless technologies, intranets, extranets, and different types of networks including WANs, LANs, VPNs, and client-server networks. The document also discusses how telecommunications and the internet are revolutionizing business through applications like e-commerce, collaboration, and information portals.
A successful data governance capability requires a strategy to align regulatory drivers and technology enhancement initiatives with business needs and objectives, taking into account the organizational, technological and cultural changes that will need to take place.
Visualisation & Storytelling in Data Science & AnalyticsFelipe Rego
The document provides an overview of data visualization and storytelling in data science and analytics. It discusses key concepts like what data visualization is, compelling reasons to visualize data like Anscombe's Quartet, visualization in the context of analytics workflows, components of effective storytelling, considerations for presentation, guidelines for data storytelling, and examples of interesting data visualizations. Throughout the document, the author emphasizes best practices like keeping visualizations clear, addressing the intended audience, and avoiding bias.
PwC is a global network of firms providing professional services including assurance, tax, and advisory services. This training module provides an introduction to metadata management, including defining metadata, the metadata lifecycle, ensuring metadata quality, and using controlled vocabularies. Metadata exchanges and aggregation are important for interoperability.
Etl And Data Test Guidelines For Large ApplicationsWayne Yaddow
This document provides guidelines for testing the quality of data, ETL processes, and SQL queries during the development of a data warehouse. It outlines steps to verify data extracted from source systems, transformed and loaded into staging tables, cleansed and consolidated in staging, and finally transformed and loaded into the data warehouse operational tables and data marts. The guidelines cover analyzing source data quality, verifying ETL processes, matching consolidated data, and transforming data according to business rules.
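One verification step from guidelines like these is source-to-target reconciliation. The following sketch, using invented in-memory tables, checks that row counts and a column total match after a load:

```python
# Reconcile a "source" extract against a "target" load by comparing
# row counts and an amount-column total. The tables are invented.
source_rows = [(1, 100.0), (2, 250.0), (3, 75.5)]
target_rows = [(1, 100.0), (2, 250.0), (3, 75.5)]

def reconcile(source, target, amount_col=1):
    """Return a dict of pass/fail reconciliation checks."""
    return {
        "row_count_match": len(source) == len(target),
        "amount_total_match": (
            sum(r[amount_col] for r in source)
            == sum(r[amount_col] for r in target)
        ),
    }

print(reconcile(source_rows, target_rows))
# {'row_count_match': True, 'amount_total_match': True}
```

In practice the same comparison would run as SQL against the source system and the staging or warehouse tables; any failing check flags a load for investigation before the data reaches the data marts.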
Introduction of Physical Database Design Process
Designing Fields
Choosing Data Types
Controlling Data Integrity
Denormalizing and Partitioning Data
Designing Physical Database Files
File Organizations
Clustering Files
Indexes
Optimizing Queries
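Two steps from the outline above, choosing data types and optimizing queries with an index, can be sketched with SQLite. The table, columns, and index name are invented for the example:

```python
import sqlite3

# Choosing column data types, then adding an index to change the
# access path of a common filter query. Names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sale (
        sale_id   INTEGER PRIMARY KEY,   -- surrogate key
        sold_on   TEXT NOT NULL,         -- ISO-8601 date string
        amount    REAL NOT NULL
    )""")
conn.executemany("INSERT INTO sale VALUES (?, ?, ?)",
                 [(i, f"2024-01-{i:02d}", i * 10.0) for i in range(1, 11)])

# Without an index, a filter on sold_on scans the whole table;
# with the index, the planner can seek directly to matching rows.
conn.execute("CREATE INDEX idx_sale_sold_on ON sale(sold_on)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT amount FROM sale WHERE sold_on = '2024-01-05'"
).fetchall()
print(plan)  # the plan should mention idx_sale_sold_on
```

This is also where the denormalization trade-off from the outline shows up: extra indexes and duplicated columns speed reads but add write and storage cost.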
Data Governance Best Practices and Lessons LearnedDATAVERSITY
Best practices and lessons learned are powerful tools for assessing an organization’s readiness and the initial activities associated with delivering a Data Governance program. There are two criteria for determining whether something is a best practice for your organization, and adopting best practices is the best way to learn from others and begin with the end in mind.
Bob Seiner will share industry data governance best practices in this month’s installment of the RWDG webinar series. Learn how to use the best practices defined in this webinar to address opportunities to improve your organization’s data governance implementation. Attend this webinar and learn that assessing your organization may not be as difficult as you think.
During this webinar Bob will discuss:
How to define data governance best practices for your organization
Criteria used to determine if a practice is best practice
How to assess your organization against industry best practice
Assessing risks associated with best practice gaps
Addressing opportunities to improve gaps uncovered in the assessment
DAS Slides: Data Governance - Combining Data Management with Organizational ...DATAVERSITY
Data Governance is both a technical and an organizational discipline, and getting Data Governance right requires a combination of Data Management fundamentals aligned with organizational change and stakeholder buy-in. Join Nigel Turner and Donna Burbank as they provide an architecture-based approach to aligning business motivation, organizational change, Metadata Management, Data Architecture and more in a concrete, practical way to achieve success in your organization.
This presentation was part of the IDS Webinar on Data Governance. It gives a brief overview of the history on Data Governance, describes how governing data has to be further developed in the era of business and data ecosystems, and outlines the contribution of the International Data Spaces Association on the topic.
How many times have you been surprised, and frustrated, to learn your IT capabilities won’t support a new or key business objective? Given the rapidly changing healthcare industry and multitude of new initiatives, this scenario happens all the time.
So how can you help ensure your IT components will work together, and can be leveraged to drive business results?
You need a blueprint — a way to align IT to the business – an IT Enterprise Architecture.
A sound Enterprise Architecture ensures your business is supported by IT components working together to deliver both a return-on-investment and projected business results.
Jaime Fitzgerald: A Master Data Management Road-Trip - Presented Enterprise D...Fitzgerald Analytics, Inc.
This document summarizes a company's journey to implement Master Data Management (MDM). It describes how initially the company's data environment was messy, with inconsistent data across systems. This caused tensions between business users and IT. The company embarked on an MDM journey in three phases: recognizing problems, understanding MDM, and executing MDM initiatives. Key landmarks included stopping immediate issues, gaining organizational buy-in, building an MDM function, and focusing on high-impact use cases like a new data warehouse. Lessons learned emphasized gaining executive buy-in and taking an iterative approach of building capabilities and applying them.
Data Ownership:
Many companies and organizations assume that data governance should be handled by the Information Technology department, because IT owns the systems that store the data. In practice, the owner of the data is responsible for defining its attributes and for answering any questions about it. The people accountable for the data are generally those involved in defining business rules and in data cleansing and consolidation.
Data Stewardship:
Data stewards should preferably be people who are familiar with the data. It is often seen that several people are deployed to handle and correct data when a single data steward could have done the same job. Since the data being handled is organization-level data, it is important that governance rules exist for this process. If a certain rule causes large volumes of data to fail, that rule should be fixed during data cleansing. It is therefore important to control the amount of flagged data sent to the stewards, since we cannot know in advance which rules will trigger how much data. Choosing data stewards is, again, a difficult decision.
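The point about rules triggering unpredictable data volumes can be made concrete. This hypothetical sketch (records, rules, and the cap are all invented) runs cleansing rules over records, counts failures per rule, and routes only a bounded sample to stewards:

```python
# Run cleansing rules, count failures per rule, and cap what each
# steward receives per rule. Records and rules are invented.
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": "", "age": 29},
    {"id": 3, "email": "c@example.com", "age": -5},
    {"id": 4, "email": "", "age": 41},
]

rules = {
    "email_present": lambda r: bool(r["email"]),
    "age_plausible": lambda r: 0 <= r["age"] <= 120,
}

MAX_PER_RULE = 1  # bound the volume sent to stewards per rule

failures = {name: [r["id"] for r in records if not rule(r)]
            for name, rule in rules.items()}
to_stewards = {name: ids[:MAX_PER_RULE] for name, ids in failures.items()}

print(failures)     # {'email_present': [2, 4], 'age_plausible': [3]}
print(to_stewards)  # {'email_present': [2], 'age_plausible': [3]}
```

The per-rule failure counts also show which rule is at fault: a rule that fails most of the data is a candidate for fixing in the cleansing logic rather than routing everything to stewards.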
Data Security:
Although master data is organization-level data, there is a confidentiality level linked to it: not every employee is authorized to view all of its aspects. Security rules can be applied to the data. The various departments in the organization must set different rules for the data they own and grant permissions against those rules so that authorized users can view the data. A large company can have data sourced from many regions, and each region must be responsible for correcting only its own data.
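The department-level permission model above can be sketched in a few lines. The departments, users, and grant structure here are invented for illustration:

```python
# Each department grants view permissions on the data it owns;
# a check consults those grants. Names are invented.
grants = {
    "finance":   {"alice", "bob"},
    "marketing": {"carol"},
}

def can_view(user, owning_department):
    """Return True if the owning department has granted this user access."""
    return user in grants.get(owning_department, set())

assert can_view("alice", "finance")
assert not can_view("alice", "marketing")
print("permission checks passed")
```

A real MDM deployment would express the same idea through the platform's role-based access controls rather than application code, but the ownership principle is identical: the granting department, not IT, decides who sees its data.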
Data Survivorship:
Data governance sets up guidelines, and these rules can change over time as new data sources are added. Changes made to the data are communicated to the organization so that data stewards and users can understand the process. From a data steward's point of view, it is important to apply security rules to the people involved in data handling and correction. This shows how data governance and data security can be applied while implementing MDM.
Keys to Creating an Analytics-Driven CultureDATAVERSITY
Changing company culture takes time, energy and focus, as well as consistent reinforcement long after the breakrooms’ company culture posters start to fade. Creating an analytics-driven culture may be even harder to grow and sustain. Yet the rewards are vast for companies whose culture embodies an analytics-first mindset – and for those who use the derived insights to improve operational efficiency and decision-making, generate new revenue and prevent risk and fraud.
This webinar will offer advice and real-world examples on how to:
Develop and utilize an analytics-focused vision statement
Engage senior leaders to support analytics as a business problem-solver
Communication best practices to engage participants in the culture change
Use tried-and-tested best practices and approaches to build an analytics-driven culture
This document discusses Gemini, a self-service business intelligence (BI) solution that aims to unite business users and IT. It does this by providing an easy to use and familiar tool (Excel) for business users to access and analyze data in a managed and compliant way. The solution leverages existing technologies like SQL Server, SharePoint and Excel Services to provide capabilities like data feeds, reports as data sources and collaboration in a secure and scalable manner while giving IT management and oversight of resources and data. The goal is to empower business users with self-service capabilities while also addressing IT needs for compliance, reliability and maintenance.
DAS Slides: Data Quality Best PracticesDATAVERSITY
Tackling Data Quality problems requires more than a series of tactical, one-off improvement projects. By their nature, many Data Quality problems extend across and often beyond an organization. Addressing these issues requires a holistic architectural approach combining people, process, and technology. Join Nigel Turner and Donna Burbank as they provide practical ways to control Data Quality issues in your organization.
This document discusses conceptual data modeling and Entity-Relationship diagrams. It defines key terms like entities, attributes, relationships and cardinality. It explains how to represent these concepts in ER diagrams and discusses best practices for naming relationships and defining domains. The goals of conceptual data modeling are to accurately represent organizational data and rules through diagrams and establish consistency between the data, process and logic models.
This document discusses enterprise data management. It defines enterprise data management as removing organizational data issues by defining accurate, consistent, and transparent data that can be created, integrated, disseminated, and managed across enterprise applications in a timely manner. It also discusses the need for a structured data delivery strategy from producers to consumers. The document then outlines some key enterprise data categories and provides a conceptual and logical view of an enterprise master data lineage architecture with data flowing between transactional systems, a data management layer, and analytics.
The document discusses data governance and why it is an imperative activity. It provides a historical perspective on data governance, noting that as data became more complex and valuable, the need for formal governance increased. The document outlines some key concepts for a successful data governance program, including having clearly defined policies covering data assets and processes, and establishing a strong culture that values data. It argues that proper data governance is now critical to business success in the same way as other core functions like finance.
This document discusses different types of data models, including hierarchical, network, relational, and object-oriented models. It focuses on explaining the relational model. The relational model organizes data into tables with rows and columns and handles relationships using keys. It allows for simple and symmetric data retrieval and integrity through mechanisms like normalization. The relational model is well-suited for the database assignment scenario because it supports linking data across multiple tables using primary and foreign keys, and provides query capabilities through SQL.
OLAP (online analytical processing) allows users to easily extract and view data from different perspectives. It was invented by Edgar Codd in the 1980s and uses multidimensional data structures called cubes to store and analyze data. OLAP utilizes either a multidimensional (MOLAP), relational (ROLAP), or hybrid (HOLAP) approach to store cube data in databases and provide interactive analysis of data.
This document provides an overview of telecommunications and networking concepts. It defines key terms like networks, Metcalfe's Law, middleware, digital networks, wireless technologies, intranets, extranets, and different types of networks including WANs, LANs, VPNs, and client-server networks. The document also discusses how telecommunications and the internet are revolutionizing business through applications like e-commerce, collaboration, and information portals.
A successful data governance capability requires a strategy to align regulatory drivers and technology enhancement initiatives with business needs and objectives, taking into account the organizational, technological and cultural changes that will need to take place.
Visualisation & Storytelling in Data Science & AnalyticsFelipe Rego
The document provides an overview of data visualization and storytelling in data science and analytics. It discusses key concepts like what data visualization is, compelling reasons to visualize data like Anscombe's Quartet, visualization in the context of analytics workflows, components of effective storytelling, considerations for presentation, guidelines for data storytelling, and examples of interesting data visualizations. Throughout the document, the author emphasizes best practices like keeping visualizations clear, addressing the intended audience, and avoiding bias.
PwC is a global network of firms providing professional services including assurance, tax, and advisory services. This training module provides an introduction to metadata management, including defining metadata, the metadata lifecycle, ensuring metadata quality, and using controlled vocabularies. Metadata exchanges and aggregation are important for interoperability.
Etl And Data Test Guidelines For Large ApplicationsWayne Yaddow
This document provides guidelines for testing the quality of data, ETL processes, and SQL queries during the development of a data warehouse. It outlines steps to verify data extracted from source systems, transformed and loaded into staging tables, cleansed and consolidated in staging, and finally transformed and loaded into the data warehouse operational tables and data marts. The guidelines describe analyzing source data quality, verifying ETL processes, matching consolidated data, and transforming data according to business rules.
Introduction of Physical Database Design Process
Designing Fields
Choosing Data Types
Controlling Data Integrity
Denormalizing and Partitioning Data
Designing Physical Database Files
File Organizations
Clustering Files
Indexes
Optimizing Queries
Data Governance Best Practices and Lessons LearnedDATAVERSITY
Best practices and lessons learned are powerful tools used to assess an organization’s readiness and initial activities associated with delivering a Data Governance program. There are two criteria to determine if something is best practice for your organization. And the definition of data governance best practice is best way to learn from others and begin with the end in mind.
Bob Seiner will share industry data governance best practices in this month’s installment of the RWDG webinar series. Learn how to use the best practices defined in this webinar to address opportunities to improve your organization’s data governance implementation. Attend this webinar and learn that assessing your organization may not be as difficult as you think.
During this webinar Bob will discuss:
How to define data governance best practices for your organization
Criteria used to determine if a practice is best practice
How to assess your organization against industry best practice
Assessing risks associated with best practice gaps
Addressing opportunities to improve gaps uncovered in the assessment
DAS Slides: Data Governance - Combining Data Management with Organizational ...DATAVERSITY
Data Governance is both a technical and an organizational discipline, and getting Data Governance right requires a combination of Data Management fundamentals aligned with organizational change and stakeholder buy-in. Join Nigel Turner and Donna Burbank as they provide an architecture-based approach to aligning business motivation, organizational change, Metadata Management, Data Architecture and more in a concrete, practical way to achieve success in your organization.
This presentation was part of the IDS Webinar on Data Governance. It gives a brief overview of the history on Data Governance, describes how governing data has to be further developed in the era of business and data ecosystems, and outlines the contribution of the International Data Spaces Association on the topic.
How many times have you been surprised, and frustrated, to learn your IT capabilities won’t support a new or key business objective? Given the rapidly changing healthcare industry and multitude of new initiatives, this scenario happens all the time.
So how can you help ensure your IT components will work together, and can be leveraged to drive business results?
You need a blueprint — a way to align IT to the business – an IT Enterprise Architecture.
A sound Enterprise Architecture ensures your business is supported by IT components working together to deliver both a return-on-investment and projected business results.
Jaime Fitzgerald: A Master Data Management Road-Trip - Presented Enterprise D... (Fitzgerald Analytics, Inc.)
This document summarizes a company's journey to implement Master Data Management (MDM). It describes how initially the company's data environment was messy, with inconsistent data across systems. This caused tensions between business users and IT. The company embarked on an MDM journey in three phases: recognizing problems, understanding MDM, and executing MDM initiatives. Key landmarks included stopping immediate issues, gaining organizational buy-in, building an MDM function, and focusing on high-impact use cases like a new data warehouse. Lessons learned emphasized gaining executive buy-in and taking an iterative approach of building capabilities and applying them.
Data Ownership:
Most companies and organizations have the notion that data governance should be taken care of by the Information Technology department, because IT owns the systems that store the data. In practice, the owner of the data is responsible for defining the data's attributes and is answerable for any questions regarding the data. The people answerable for the data are generally the ones involved in defining business rules, data cleansing, and consolidation.
Data Stewardship:
Data stewards should preferably be people who are familiar with the data. It is often the case that several people are deployed to handle and correct data when a single data steward could have done the same job. Since the data being handled is organization-level data, it is important that there are governance rules for this process. If a certain rule causes large volumes of data to fail, that rule should be fixed during data cleansing. It is therefore important to control the volume of data sent to the stewards for cleaning, since we do not know in advance which rules might trigger what amount of data. The choice of data stewards is, again, a difficult selection.
Data Security:
Although master data is organization-level data, there is a level of confidentiality linked to it. Not every employee is authorized to view all of its aspects, so security rules can be applied to the data. The various departments in the organization must set rules for the data they own, and grant permissions under those rules so that the right users can view the data. A large company can have data sourced from many regions, and it must be ensured that each region is responsible for correcting only its own data.
Data Survivorship:
Data governance sets up guidelines, and these rules can change over time as new data sources are added. Changes made to the data are communicated to the organization so that data stewards and users can understand the process. From a data steward's point of view, it is important to apply security rules to the people involved in data handling and correction. This is how data governance and data security can be applied while implementing MDM.
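The department-level security rules described above can be sketched as a simple permission filter over master-data attributes. This is a hypothetical illustration (invented departments, fields, and rules), not any particular MDM product's API:

```python
# Sketch: restricting master-data attributes based on department-level rules.
# Departments, fields, and the RULES table are hypothetical illustrations.

RULES = {
    "finance":   {"name", "credit_limit", "balance"},
    "marketing": {"name", "zip_code"},
}

def visible_view(record, department):
    """Return only the attributes the department is permitted to see."""
    allowed = RULES.get(department, set())
    return {field: value for field, value in record.items() if field in allowed}

customer = {"name": "A. Smith", "zip_code": "44317",
            "credit_limit": 5000, "balance": 1200.50}

print(visible_view(customer, "marketing"))  # {'name': 'A. Smith', 'zip_code': '44317'}
```

A department with no registered rules sees nothing, which matches the grant-based model described above: visibility exists only where a permission was explicitly set.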
This presentation will cover the definition of Master Data Management, describe potential MDM hub architectures, outline 5 essential elements of MDM, and describe 11 real-world best practices for MDM and data governance, based on years of experience in the field.
• History of Data Management
• Business Drivers for implementation of data governance
• Building Data Strategy & Governance Framework
• Data Management Maturity Models
• Data Quality Management
• Metadata and Governance
• Metadata Management
• Data Governance Stakeholder Communication Strategy
Master data management executive mdm buy in business case (2) (Maria Pulsoni-Cicio)
The document provides guidance on gaining executive support for master data management (MDM) projects. It recommends quantifying the hidden costs of bad data, conducting interviews with stakeholders across business units to understand data issues, and analyzing the findings to build a business case that shows the specific financial benefits of implementing MDM. Key steps include identifying stakeholders in IT and business functions, preparing interview questions tailored to different roles, interviewing a wide range of staff, and using the results to quantify savings and improved revenues from reducing data problems.
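The quantification step this summary recommends can be sketched in a few lines of arithmetic. Every figure below is invented purely to show the calculation, not a benchmark:

```python
# Sketch: quantifying the hidden cost of bad data for an MDM business case.
# All rates and unit costs here are made-up illustrations.

def annual_cost_of_bad_data(records, duplicate_rate, cost_per_duplicate,
                            error_rate, cost_per_error):
    """Rough annual cost: duplicate records plus correction effort on bad ones."""
    duplicates = records * duplicate_rate
    errors = records * error_rate
    return duplicates * cost_per_duplicate + errors * cost_per_error

cost = annual_cost_of_bad_data(
    records=1_000_000,        # customer records in scope (hypothetical)
    duplicate_rate=0.03,      # 3% duplicated across systems
    cost_per_duplicate=2.50,  # wasted mailings, rework, etc.
    error_rate=0.05,          # 5% with a bad attribute
    cost_per_error=1.00,      # manual correction effort
)
print(f"${cost:,.0f} per year")  # $125,000 per year
```

The stakeholder interviews described above are what supply defensible values for the rates and unit costs; the arithmetic itself is the easy part of the business case.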
This document provides an overview of a session on business intelligence, data science, and data mining. The goals of the class are to understand how to solve business problems using data analytics, various tools and methods for implementing solutions, and how to store and access large amounts of data. The focus areas include data warehousing, data mining, simulation, and deriving profitable business actions from databases. Popular tools mentioned include RapidMiner, R, Excel, SQL, Python, Weka, KNIME, Hadoop, SAS, and Microsoft SQL Server. Benefits of business intelligence include increased profitability, decreased costs and risks, and improved customer relationship management.
- The document discusses the importance of data governance for organizations to become data-driven and overcome challenges around revenue, costs, compliance, and risks.
- It outlines six steps to implement smart data governance including understanding data flows, business processes, and the information supply chain.
- Effective data governance requires coordinating different groups like architects, stewards, IT, and business to define standards, monitor compliance, and ensure information is used to achieve organizational goals.
1. The document discusses big data and the need to generate actionable insights from large amounts of data. It notes big data can help solve important problems in healthcare, transportation, energy and other industries.
2. Current methods for working with big data are not scalable and take too long to produce insights. More focus is needed on generating insights faster through applications that better leverage data scientists and domain experts.
3. Trident Capital has invested in big data solutions that produce quicker, higher-value insights for specific industries like healthcare, transportation, and energy. These more targeted industry applications provide a faster path to return on investment.
In this presentation, we'll help you better understand Master Data Management (MDM) and data governance, present some useful MDM and data governance best practices, talk about what works and what doesn’t, cover the importance of a holistic approach, and discuss how to get the political aspects right.
This document provides an agenda and overview for a data warehousing training session. The agenda covers topics such as data warehouse introductions, reviewing relational database management systems and SQL commands, and includes a case study discussion with Q&A. Background information is also provided on the project manager leading the training.
Data come from everywhere and everyone, but we need to be able to analyze them in ways that improve performance. Companies need to work faster, but they also have to use those data to anticipate and innovate.
DataOps: Nine steps to transform your data science impact, Strata London May 18 (Harvinder Atwal)
According to Forrester Research, only 22% of companies are currently seeing a significant return from data science expenditures. Most data science implementations are high-cost IT projects, local applications that are not built to scale for production workflows, or laptop decision support projects that never impact customers. Despite this high failure rate, we keep hearing the same mantra and solutions over and over again. Everybody talks about how to create models, but not many people talk about getting them into production where they can impact customers.
Harvinder Atwal offers an entertaining and practical introduction to DataOps, a new and independent approach to delivering data science value at scale, used at companies like Facebook, Uber, LinkedIn, Twitter, and eBay. The key to adding value through DataOps is to adapt and borrow principles from Agile, Lean, and DevOps. However, DataOps is not just about shipping working machine learning models; it starts with better alignment of data science with the rest of the organization and its goals. Harvinder shares experience-based solutions for increasing your velocity of value creation, including Agile prioritization and collaboration, new operational processes for an end-to-end data lifecycle, developer principles for data scientists, cloud solution architectures to reduce data friction, self-service tools giving data scientists freedom from bottlenecks, and more. The DataOps methodology will enable you to eliminate daily barriers, putting your data scientists in control of delivering ever-faster cutting-edge innovation for your organization and customers.
This document discusses simplifying analytics strategies. It recommends pursuing a simpler path to insights by accelerating data through a hybrid technology environment. This allows for fast delivery of analytics to improve service quality. It also recommends delegating work to analytics technologies like business intelligence and data visualization to more easily uncover patterns. Different analytic techniques like data discovery, applications, and machine learning can further help companies gain insights from their data in a simplified manner. The path to insights is unique for each company based on their goals, data, and technologies.
IRM Data Governance Conference February 2009, London. Presentation given on the Data Governance challenges being faced by BP and the approaches to address them.
Master Data Management's Place in the Data Governance Landscape (CCG)
This document provides an overview of master data management and how it relates to data governance. It defines key concepts like master data, reference data, and different master data management architectural models. It discusses how master data management aligns with and supports data governance objectives. Specifically, it notes that MDM should not be implemented without formal data quality and governance programs already in place. It also explains how various data governance functions like ownership, policies and standards apply to master data.
The pioneers in the big data space have battle scars and have learnt many of the lessons in this report the hard way. But if you are a general manager just embarking on the big data journey, you should now have what they call the "second mover advantage". My hope is that this report helps you better leverage that advantage. The goal here is to shed some light on the people and process issues in building a central big data analytics function.
Why Everything You Know About bigdata Is A Lie (Sunil Ranka)
As a big data technologist, you can bet that you have heard it all: every crazy claim, myth, and outright lie about what big data is and what it isn't that you can imagine, and probably a few that you can't. If your company has a big data initiative or is considering one, you should be aware of these false statements and the reasons why they are wrong.
The challenges of big data, how data capable is your business? (DQM Group, Internet World)
The document discusses the challenges of big data and how to measure an organization's data capabilities. It outlines the opportunities big data provides, such as recalculating risk portfolios and analyzing social media data. However, big data also presents challenges like needing a defined data strategy and ensuring talent and technology are aligned. It introduces the data maturity curve to assess an organization's data management practices and provides examples of metrics to measure people, processes, and technology. The document emphasizes having a data strategy, considering third parties and future legislation, and thinking carefully about current capabilities before pursuing big data initiatives.
The document discusses the challenges clients face with bad customer data, including inconsistent data between systems, lack of data standards and ownership, difficulty retrieving archived data, and high costs of data issues. It provides examples of data quality problems that have cost companies millions or billions of dollars. The document advocates implementing data management and architecture practices to address these challenges and ensure accurate, consistent and secure customer data.
Data Management: Case Study Presented @ Enterprise Data World 2010
1. Tales from a Master Data Management Road Trip
March 17, 2010
1:30 – 2:30 pm
Architects of Fact-Based Decisions™
Jaime Fitzgerald and Art Garanich
2. 2010-3-17 Enterprise Data World Presentation
Table of Contents
1. Introduction
2. Overview: Why Did We Embark on This Journey?
3. Key Landmarks on the Journey
4. A Crucial Turning Point: Moving to Execution
5. Lessons Learned
6. Results: What Makes it All Worthwhile
3.
4.
Introduction:
What Are We Going to Cover Today?
Today, we would like to:
Introduce ourselves
Share our experiences on the “Journey of MDM Transformation”
Share the lessons we’ve learned – what worked well, what didn’t
Answer questions you may have
Encourage others to start their own transformation!
5.
Who Are We?
Architects of Fact-Based Decisions™
Jaime Fitzgerald
– About Me: 13 years in Management Consulting; focus on the strategic value of data, and helping companies profit from it; enjoys cycling and parenting.
– About My Company: boutique strategy consulting firm focused on fact-based decisions; takes a holistic approach to turning "data into dollars".
Art Garanich
– About Me: 25 years in Technology; focus on legacy modernization; enjoys metaphors and parenting.
– About My Company: private label credit card issuer; subsidiary of Bridgestone Firestone.
6.
Table of Contents
1. Introduction
2. Overview: Why Did We Embark on This Journey?
3. Key Landmarks on the Journey
4. A Crucial Turning Point: Moving to Execution
5. Lessons Learned
6. Results: What Makes it All Worthwhile
7.
Before our Journey, the Data Environment at CFNA was “Messy”
CFNA’s data environment was not actively managed…causing pain and tension in many places
Prior to our MDM Transformation, we faced:
Multiple platforms
Numerous locations of data
Limited documentation of:
– Data locations
– Data elements
– Relationships between elements
– Business rules
– Business purposes & users of data
– Data flow
– Existing documentation not always utilized
No holistic view of our customers
Extremely time-consuming to pull new information
Significant tension between business users,
analysts, and IT staff
Our Legacy Data Environment
This presentation is about our journey to “a better place”
8.
Examples of the Tension "Back Then"
Analytics Team: "Our analysis generated millions of dollars in new value… but it took forever to obtain and clean the data!"
Sales and Marketing: "When I talk to customers I scroll through 12 screens to find the info I need… then I don't know where to put the info I capture!"
Operations Functions: "Poor data, systems, and product features are holding us back!"
9.
Examples of the Tension "Back Then"
Users didn't trust existing reports… so they kept demanding more new reports, and they created "homegrown" reports with "surprising results". Executives de-prioritized fact-based decisions. An unhealthy relationship developed between users and IT…
"We don't have data to measure customer value."
"Why bother asking for something they can't do?"
"Before I can tell you what I want from it, I need to know what it can do!"
"I THINK we should try new pricing…."
10.
An Ongoing Journey Towards Improvement
Our adventure got underway via three main phases…
Phase 1: Suffering Led to Interest ("What is MDM?" "Should I care?")
– State of Information Landscape: MDM Knowledge: Low; Data Situation: Messy; Brittle Systems ("We can't change that!")
– Results of Current State: Many Pain Points…; Complexity Increasing ("Here's a workaround."); Data Quality: Low
Phase 2: …Which Led to Desire for a Cure… ("How do we get there?")
– State of Information Landscape: There Is a Cure! Get Me to It! "Where To?"
– Results of Current State: List of Pain Points Growing; Let's Stop the Bleeding… And Start with the Basics
Phase 3: …and to a Never-Ending Journey Towards Better MDM ("Feeling better every day")
– State of Information Landscape: MDM Function Setup; MDM Governance in Place
– Results of Current State: "Targeted" Application to Gain: more confident decisions, more effective system modernization, reduced operational risk, more transparency ("Show me the Results!")
11.
Table of Contents
1. Introduction
2. Overview: Why Did We Embark on This Journey?
3. Key Landmarks on the Journey
4. A Crucial Turning Point: Moving to Execution
5. Lessons Learned
6. Results: What Makes it All Worthwhile
12.
Key Landmarks
Since beginning this journey, we passed four key landmarks:
1. Stopping the Bleeding
2. Getting Buy-in & Alignment
3. Building the MDM Function
4. Turning the Corner to "Targeted Application"
What it was like:
Some problems were SO painful and SO immediate, we needed to apply MDM principles "surgically" even before we had a fully fledged MDM function. For example: building our customer profitability database, we encountered and solved data quality issues and created a "safe route" for data to enable this analysis.
A few early wins on analysis increased the appetite for even better information. The organization recognized the role of MDM in improving information AND agility, and the IT department shifted towards more strategic imperatives.
We built a new MDM function to define and institutionalize key best practices, and established our policies, standards, governance, and stewardship roles.
We reached a fork in the road: with lots of new knowledge built, where should we begin applying it? The list of pain points was long, and we realized we couldn't "boil the ocean", so we developed a "targeted approach" to applying MDM capabilities. Our first successful case: a new Data Warehouse for the growing analytics team.
13.
About "Stopping the Bleeding"
If the skills, experience, and best practices are missing, you can TRY and FAIL to stop the bleeding:
– New SQL instance with consolidated data: led to more misuse of data (right data, wrong use)
– Self-serve data access via Intranet: data security issues arose
– "Home-grown data stores": preserved existing problems with data and created new ones!
We had tried many times to fix data-related problems, without addressing the root issues well…
14.
Our First Landmark: "MDM Critical Care" to "Stop the Bleeding"
We needed to stop the bleeding before we could resolve the systemic patterns causing it…
Diagnosis
– Immediate Issue: "Toxic Data". Symptoms (Immediate): high-stakes analysis underway; data quality low; strategic growth at risk.
– Systemic Issue: "The Downward Spiral" (1. request for change to systems/data → 2. workaround solution → 3. increased complexity). Symptoms (Systemic): a systemic problem with data systems; frequent symptoms point to a larger problem, with broader root causes.
Prescriptions
1. Short-term: stop the bleeding.
2. Once the bleeding stops: deal with the more fundamental issues. Change the way you manage data: manage data as distinct from systems or processes, while keeping the inter-relationships in mind.
15.
Our Second Landmark: Gaining Buy-In
We took the time to "connect the dots" in ways that built buy-in for an MDM function…
Drivers of Buy-In:
– Strategy for Growth: required more analytics; required better systems, new products, etc.
– Value of Analytics Increases
– High-profile "wins" and desire for more
– Dissatisfaction with Status Quo
How Buy-In Was "Formalized":
– Creation of MDM function
– Establishment of governance
– Skills and knowledge built and acquired
16.
Our Third Landmark: Building the MDM Function
Establishing the function included creation of 1) the MDM Team itself and 2) the data stewards…
MDM Team (Enterprise Level):
– Responsible for optimizing the benefits of data at the enterprise level
– Reviews and advises on MDM consequences of IT/Business changes
– Builds & maintains the MDM function, including: 1) guiding principles, 2) essential capabilities, 3) documentation
Data Stewards (Department Level), one for each of: 1. Sales & Marketing, 2. Operations, 3. Finance, 4. Analytics, 5. Information Technology:
– Responsible for optimizing the benefits of data at the departmental level
– Responsible for ensuring standard use of data according to MDM principles and standards
– Ensures clear business requirements with regard to data elements, usage, business rules, and communication with IT
17.
Our Fourth Landmark: Turning the Corner to "Targeted Application"
To get value from our investment, it was essential to move from theory to execution…
Selection of Opportunities:
– Pain-Point Identification
– Prioritization of Pain Points
– Initiative-Level Prioritization
– Resource Allocation
Application of MDM Principles:
– Implementation standards and frameworks
– Tools
– Integration with SDLC (System Development Lifecycle) and Project Management Standards
18.
Table of Contents
1. Introduction
2. Overview: Why Did We Embark on This Journey?
3. Key Landmarks on the Journey
4. A Crucial Turning Point: Moving to Execution
5. Lessons Learned
6. Results: What Makes it All Worthwhile
19.
Execution
At first, we were overwhelmed with choices… Where to start? What destination first?
Our Legacy Data Environment: prior to our MDM Transformation, we faced:
Multiple platforms
Numerous locations of data
Limited documentation of:
– Data locations
– Data elements
– Relationships between elements
– Business rules
– Business purposes & users
– Data flow
– Existing documentation not always utilized
No holistic view of our customers
Extremely time-consuming to pull new information
Significant tension between business users, analysts, and IT staff
20.
Our Solution: "Work Backwards from the Goal" of ROI on MDM Programs
Ultimate Goal = Return on Investments in MDM Programs
Precondition: application of MDM principles in high-ROI ways (Applied Case 1, Case 2, Case 3, Case 4)
How Do We "Bridge this Gap"?
21.
"Unpacking the Steps" on the Pathway to ROI…
Ultimate Goal = Return on Investment
Preconditions:
1. Commitment: Organizational Commitment. Understand concepts, principles, application, and high-level techniques to measure value; understand the problems MDM solves, and therefore the value proposition.
2. Governance: develop governance capability ("How do we manage this?"). Principles; best practices; governance framework; data stewardship roles and processes.
3. Selection: capability to select & execute on the best opportunities. Pain-point identification; prioritization of pain points; initiative-level prioritization; resource allocation.
4. Execution: apply MDM principles & best practices to high-impact use-cases. Implementation standards and frameworks; tools; integration with SDLC (Solution Development Lifecycle) and Project Management Standards.
22.
Challenges at Each Stage
Stage (Precondition) 1: Organizational Commitment
– Challenges: without broad buy-in, implementation will be "messy"; the link between MDM and business results is not obvious to everyone, and some people NEED specifics to believe in it.
– Solutions and Learnings: use specific examples to help people understand the problems MDM solves; solve a problem right away, even if this happens before a full-fledged program exists.
Stage 2: Develop Governance Capability ("How do we manage this?")
– Challenges: there is a lot to learn; governance requires ongoing commitment; governance alone doesn't unlock results.
– Solutions and Learnings: align governance with OTHER key areas where standards already exist; link with Knowledge Management; maximize cross-functional breadth.
Stage 3: Capability to Select & Execute on the Best Opportunity
– Challenges: users often report pain points without broader context; the business case for solving individual pain points can be ambiguous.
– Solutions and Learnings: look for projects with the biggest business benefit AND most likely to benefit from MDM; "bundle" pain points into the projects that would solve them.
Stage 4: Apply MDM Principles & Best Practices to High-Impact Use-Cases
– Challenges: some organizations try to jump right to this point (the myth of turnkey application); even WITH preparation, this is a tough transition to make.
– Solutions and Learnings: a good "toolkit" is essential; facing specifics improves the frameworks and standards; getting results maintains momentum.
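The selection rule in Stage 3, biggest business benefit AND most likely to benefit from MDM, can be sketched as a two-factor score. The candidate projects and scores below are hypothetical illustrations, not the deck's actual figures:

```python
# Sketch: ranking candidate projects by business benefit AND fit with MDM.
# Projects and their 1-10 scores are invented for the example.

projects = [
    {"name": "Customer profitability database", "benefit": 9, "mdm_fit": 8},
    {"name": "New report formatting",           "benefit": 3, "mdm_fit": 2},
    {"name": "Analytics data warehouse",        "benefit": 9, "mdm_fit": 9},
]

def score(project):
    # Multiplying (rather than adding) favors projects strong on BOTH criteria,
    # matching the "benefit AND fit" selection rule.
    return project["benefit"] * project["mdm_fit"]

ranked = sorted(projects, key=score, reverse=True)
print(ranked[0]["name"])  # Analytics data warehouse
```

A scoring sheet like this also makes it easy to "bundle" several pain points under one project: each bundled pain point simply raises that project's benefit score.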
23. 232010-3-17 Enterprise Data World Presentation
Our Decision:
To Focus our Limited MDM Resources on Building a Better Data Warehouse…
The Data Warehouse will provide consolidated data which enables better strategic decisions
Benefits of the Data Warehouse

High Stakes Decisions
Current state: Decisions often made with imperfect data
Future state: Better data (more accurate, relevant and holistic) will enable better decisions

Efficiency of Analysis
Current state: Analysis is time consuming; involvement from IT necessary to source data
Future state: Increased speed of analysis; more frequent updates available; fewer resources needed per analysis; analysts will not be accessing operational data, reducing operational burden

Integration of Sources
Current state: Inconsistent data limits ability to use multiple data sources
Future state: Combined data will provide a holistic view of customer and business

Security
Current state: Ad-hoc use of data exports is not optimal; analysts access operational data, causing some risk to systems
Future state: Data Warehouse will mask sensitive customer data; a standard source will limit ad-hoc usage; analysts will not be accessing operational data

Data Clarity
Current state: Data sources are not documented; inconsistent fields across data sources
Future state: Greater clarity about data thanks to well-documented sources, data management standards, and data dictionaries
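Two of the warehouse benefits described here, masking sensitive customer data and documenting fields in a data dictionary, can be sketched together in a few lines. This is a minimal, hypothetical illustration: the field names (customer_id, card_number, purchase_amount) and the hash-based masking scheme are assumptions for the example, not CFNA's actual implementation.

```python
import hashlib

# Minimal data dictionary: each documented field carries a type
# and a masking policy. Field names here are illustrative only.
DATA_DICTIONARY = {
    "customer_id": {"type": "str", "sensitive": False},
    "card_number": {"type": "str", "sensitive": True},
    "purchase_amount": {"type": "float", "sensitive": False},
}

def mask(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return "MASKED-" + hashlib.sha256(value.encode()).hexdigest()[:8]

def load_row(row: dict) -> dict:
    """Apply the dictionary's masking policy before a row reaches analysts."""
    out = {}
    for field, value in row.items():
        spec = DATA_DICTIONARY.get(field)
        if spec is None:
            continue  # undocumented fields are dropped, forcing documentation
        out[field] = mask(str(value)) if spec["sensitive"] else value
    return out

row = {"customer_id": "C001", "card_number": "4111111111111111",
       "purchase_amount": 59.99}
print(load_row(row))
```

The design point mirrors the slide: analysts query the masked, documented warehouse copy rather than operational data, so sensitive values never leave the load process in the clear.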
24. Table of Contents
1. Introduction
2. Overview: Why Did We Embark on This Journey?
3. Key Landmarks on the Journey
4. A Crucial Turning Point: Moving to Execution
5. Lessons Learned
6. Results: What Makes it All Worthwhile
25. Lesson Learned: Value of Buy-in
Buy-in from the Executive team was essential to moving forward
26. Lesson Learned: An Iterative Approach Maintains Momentum
[Diagram: an iterative cycle alternating Building Capabilities and Applying Capabilities. Each increment of time and money invested yields capabilities; each application targets a high-ROI opportunity, gets value, measures value, communicates and promotes it, then targets the next opportunity. Axes: Time and Money Invested vs. Measurable Benefits Unlocked.]
By taking an iterative approach that mixes building and applying capabilities, we’ve found it easier to maintain momentum and buy-in…
27. Lesson Learned: Importance of Capability Building
We had to put in place the basic infrastructure and skills before we could move forward
Develop Basic Knowledge
Data Stewards
Governance Policy, Standards, and Procedures
Develop Basic Skills
MDM Team
28. Table of Contents
1. Introduction
2. Overview: Why Did We Embark on This Journey?
3. Key Landmarks on the Journey
4. A Crucial Turning Point: Moving to Execution
5. Lessons Learned
6. Results: What Makes it All Worthwhile
29. A Better Pattern: Then vs. Now
[Diagram: Old Pattern (“The Downward Spiral”) – pain points lead, via a long journey, to (1) requests for change to systems/data, (2) workaround solutions, and (3) increased complexity. New Pattern – the MDM function connects pain points and business goals to business results, ROI, and learnings.]
30. Results: Enterprise-Level Progress!
Our strategy is working, analytics has become a strength, IT is more nimble, and profits are up!
Areas of Progress
Strategy
Growth strategy has gained traction, with new capabilities driving growth in card sales volume, fee revenue, and profits
Confidence in organizational capacity for continued growth is up
Analytics
Starting with construction of customer profitability analytics, the team has unlocked tens of millions of dollars in new profit growth
Analytics team has grown from one person to six (and from a team to an official function!)
Legacy Modernization
We have begun phased legacy modernization that will enable more strategic growth
Our MDM capabilities will be essential to this modernization initiative
Profit Growth
Despite the economic crisis, we achieved record sales and profits this year
While regulatory uncertainty has increased this year, our analytic capabilities are helping us to rapidly adapt…
31. Continue the Journey with Us!
Architects of Fact-Based Decisions™
Art Garanich: garanichart@cfna.com, 216-362-3418
Jaime Fitzgerald: jfitzgerald@fitzgerald-analytics.com, 917-846-3759
Our journey continues…we hope to stay in touch with you, our “fellow travelers,” to learn from
each other and improve results!
Editor's Notes
Next steps: Art by EOD or Wed am – notes
JF add his by Wed PM
Mark: why would you do this? Art: 1) get out of comfort zone, 2) Meet others who help us, 3) JF: get extra value via our thought-partnership
Walk through with equipment…
Art
JF to prepare succinct talking points re setup:
Our goals, their goals
Time Keeper = Shannon
Both present:
CFNA: division of Bridgestone – highly regulated….
25 years with Enterprise work, now doing legacy modernization….
Have enough experience to share, but lots also to learn….
Art notes:
Focus on legacy modernization – have been working on this for a long time, have recognized the need for it
Over time it has become apparent how essential data management will be to enable legacy modernization….
Art’s background:
Started at Anicom out of college – built system for Aramco oil: a global mainframe-based purchasing system. Used technology in ways “ahead of our time” – process maturity, standards, etc
After several years, they were pursuing a contract w Exxon, but Exxon was not comfortable with Anicom’s small size….EDS wanted a relationship w Exxon, so they acquired this division of Anicom…..landed Exxon as a client
Career changed when Art became lead on new software development – enterprise software handling end-to-end purchasing – “bill to payment” – Werton Stell = client. Leadership role managing customer and team – several years, and a very successful project.
Other companies were interested in this purchasing product, but wanted new architectures and software (it was COBOL mainframe) – EDS was not able to find resources or partners to make this happen. Current was in IMS, goal was more relational….there was concern about what would happen to the division, so they turned it into an internal resources/staff-aug plus advisory consulting org (internal). Supported GM customers throughout NE Ohio…..With Mark Kula, did a Data Warehouse for Benjamin Moore Paints….
Y2K was crazy time…..lots of involvement there
~ 2000, opportunity arose in the credit card processing unit (where Art’s wife also works) – new role, knew little about the infrastructure needed in this function (telco, wide-area networks, txn processing)
After 1 year, EDS centralized this function – Art’s title became Service Delivery Executive – in charge of managing the relationship between EDS and clients…..“reach out person”
Got to the point where EDS was trying to sell capabilities they were not able to support…for example, bringing in outsourcing contracts, committing to SLAs not currently being achieved, NOT improving the infrastructure
JF to package this into useful document!
When Art was hired on, there was a guy named Bob Porter – a visionary – of how a solution could and should be developed – standards were remarkable – example: if you generated COBOL code, 95% of the code was “generated” – even on maintenance programs, 50-60% was
Met Tawney when he started….she was part of the team that developed that capability….she was very technical back then…..the structure and standards…..
System consisted of 500-600 programs. If you had a problem….it happened in the same part of the program……so making the fix was very efficient…..
Realized that to be successful, you didn’t need to be a technical GURU….felt this became a potential liability…..Arlene: let’s get you into the system…..realized needed good people……trustworthy people…..realized the importance of people skills…..staff loyalty and retention a passion and a strength….
EDS developed a career progression that fit this profile (Art’s profile) – developed two career paths (technical vs leadership) –
Art presents this section
Art presents this section
For many years, complexity ruled
Increased commitment to analytics
Early efforts around Data Integration had “unintended consequences”
Documentation minimal – working with eyes half closed
IT/Biz Unhappy…..needed to get to a better place
Art: A few examples of the tension we saw back then….
1) First hit the quotes
2) Then talk about the Business Case for Fixing the “Analytics Pain Point”
Art: favorite Quote is on right…..
JF: comment on universality of these perspectives….
Art: bc of the pain….we realized we needed to change….
The suffering that led to interest…..
Tee up the “essential moment of truth….” – preview how seriously we’re going to address that in more detail in the project…..JF prep thoughts….
Jaime presents this section
JF riff is on causal links between these items….
Highlight examples: fixing data as precondition to customer profitability database
Early wins increased appetite
Diagnosis at BOTH the Enterprise/Holistic POV….
One key element of stopping the bleeding…..BUYING TIME….(buy time)
Compliance and risk
Focus is on Data Warehouse, business case is on Data Warehouse
Note upon return: messaging and communication around
1) Needed to build the function to gain the benefits…..
Now that it has been built, we have the ability to apply MDM principles in a structured way……
2) In terms of applying the function in high value ways, we had to start somewhere, and that first project is the Data Warehouse….
But over time we’ll be applying it to a variety of high-stakes legacy system modernization initiatives (where data management is a key precondition to success in these initiatives)
Key preconditions to application
Integrate principles and best practices and standards into existing process and core capabilities as an organization
If you have these core capabilities, you can plug MDM in. If you don’t have those capabilities, MDM will be harder to apply….
If you can’t read this, don’t worry, bc it’s the same picture of current state before we started our journey….
Resource allocation =
Pain point ID…..a bundle is being solved
Note to JF – pain points back in the building
Stage 1 – buy-in based on pain point
Point 3 – opportunity to be engaged
It will never be turnkey bc it’s more of a discipline than a tangible application
Art
To maintain buy in we realized that while building the capabilities we needed to show the value….
Started w Data Warehouse…..iterative…..
Fortunately, leads nicely into next projects
Right people in right roles
Structure in place – find right people….