This presentation illustrates best practices in master data governance through a rich set of case studies. The presentation leverages seven years of in-depth experience in the field from the Competence Center Corporate Data Quality.
Data Quality as a Business Success Factor (Boris Otto)
The document discusses data quality as a business success factor. It provides two case studies: (1) At automotive supplier ZF Friedrichshafen AG, consistent and accurate master data is required for customer relationship management. (2) At Bayer CropScience, root causes of poor data quality were identified, including a lack of data quality training and heterogeneous data maintenance tools. The document emphasizes that corporate data quality management relates to business strategy and should follow a lifecycle approach. Benefits of improved data quality can include inventory savings and reduced costs of obsolete records.
Data Resource Management: Good Practices to Make the Most out of a Hidden Tre... (Boris Otto)
Management of the data resource is becoming a strategic capability for the industrial enterprise in the digital age. The talk motivates data resource management, presents proven practices, and outlines the principles of modern data management approaches.
Data Governance from a Strategic Management Perspective (Boris Otto)
This document summarizes a presentation on data governance from a strategic management perspective. It discusses data governance as a dynamic capability that allows companies to address changing market needs by integrating, reconfiguring, gaining and releasing resources. It provides examples of how different companies have implemented and evolved their data governance over time, with some facing challenges integrating governance into daily operations. Effective double-loop learning and changing perceptions of data management are identified as important success factors for improving data governance maturity.
Modernizing the Enterprise Monolith: EQengineered Consulting Green Paper (Mark Hewitt)
Are you an enterprise that recognizes the business liability inherent in the monolithic or otherwise dated enterprise software applications you have built? Does your technology represent an impediment to the needed agility and flexibility required to meet the needs of today’s business environment?
Historically, enterprise software development favored an approach that incorporated all functionality into a single process, replicated across servers as additional capacity was required. Today, these large applications have become bloated and unmanageable as new features and functionality are added. And as small changes are made to existing functionality, updating and redeploying the server-side application becomes an increasingly intractable undertaking.
Forward-thinking organizations like Amazon and Netflix led the way toward agile processes, deconstructed software stacks, and efficient APIs. Both large and small organizations serious about embracing modern practices have followed by decoupling the front and back end of their enterprise applications, employing microservices and cloud technologies, and adopting agile methodologies. These very steps can serve to highlight additional technical deficits in old solutions and codebases, which in turn become stumbling blocks to modern development practices.
As these technology trends continue to evolve, how can your company keep pace and remain viable?
In this green paper, we discuss how CIOs, CTOs, and VPs of Engineering can lead the needed modernization with their counterparts in marketing and the business to ensure that their organizations remain competitive in today’s customer-driven and technology-led economy.
Key questions addressed include:
• Why is technical modernization vital for the business?
• What types of modernization projects are there?
• How does modernization fit into your organization?
This document provides an overview of data-driven business models for manufacturing companies, presented by Dr. Karan Menon. Some key points:
- The Industrial Internet of Things enables new data-collection capabilities that allow for more customized, optimized, and dynamically priced products and services.
- Manufacturing companies are evolving their business models from traditional product sales to non-ownership models such as pay-per-use, pay-per-outcome, and pay-per-output, which provide new opportunities for growth.
- Tools like the morphological box can help companies transition to these new data-driven business models by mapping their current and envisioned future states to identify necessary changes and capabilities.
- Case studies of companies like C
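The current-state/future-state mapping described above can be sketched as a small data structure. This is a minimal illustration only; the dimensions and options below are invented for the example and are not taken from Dr. Menon's slides.

```python
# A morphological box lists design dimensions and the possible options for
# each. Comparing a current and an envisioned future configuration surfaces
# the dimensions where new capabilities must be built.
# NOTE: dimensions and options are illustrative assumptions.

MORPHOLOGICAL_BOX = {
    "revenue model":   ["product sale", "pay-per-use", "pay-per-outcome"],
    "data collection": ["none", "batch sensor logs", "real-time IIoT streams"],
    "pricing":         ["fixed list price", "usage-based", "dynamic"],
}

def transition_gaps(current, future):
    """Return (dimension, current option, future option) for every
    dimension where the envisioned state differs from today's."""
    gaps = []
    for dimension, options in MORPHOLOGICAL_BOX.items():
        if current[dimension] not in options or future[dimension] not in options:
            raise ValueError(f"unknown option for {dimension!r}")
        if current[dimension] != future[dimension]:
            gaps.append((dimension, current[dimension], future[dimension]))
    return gaps

current_state = {"revenue model": "product sale",
                 "data collection": "batch sensor logs",
                 "pricing": "fixed list price"}
future_state  = {"revenue model": "pay-per-use",
                 "data collection": "real-time IIoT streams",
                 "pricing": "usage-based"}

for dim, now, later in transition_gaps(current_state, future_state):
    print(f"{dim}: {now} -> {later}")
```

Each printed line is a capability gap the company would need to close to reach the envisioned business model.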
International Data Spaces: Data Sovereignty for Business Model Innovation (Boris Otto)
This presentation, given at the European Big Data Value Forum on November 13, 2018, in Vienna, introduces International Data Spaces (IDS) as a reference architecture and implementation for data sovereignty. The IDS architecture rests on usage-control technologies and trusted computing environments and thus forms a strategic enabler for a fair data economy that respects the interests of data owners.
Accurate BI & MDM Lead to Successful Project Execution! (Orchestra Networks)
McDermott International’s global CIO describes why MDM is vital for accurate reporting, BI and big data analytics.
Presented at Gartner Enterprise Information and Master Data Management Summit, Las Vegas.
McDermott is an engineering and construction company focused on oil and gas field development projects. McDermott PARS (Project Analytics and Reporting System) supports project delivery with reports that integrate information from across the enterprise. At the heart of PARS is an MDM system that manages the relationships (between domains, applications, and time) required for accurate reporting and analytics.
This document summarizes key findings from the 2018 22nd Annual 3PL Study. The study examines the current state of the third-party logistics (3PL) market through an email survey of shippers and 3PL/4PL providers from major industries globally. Special topics in the 2018 study include blockchain for supply chain, automation/digitization in transportation, and risk/resilience in shipper-3PL relationships. The study also explores contemporary issues like the logistics talent revolution.
Next Generation Customer Communications Management (Nuxeo)
Each year, banks and insurers generate billions of customer communications with one simple goal: a more personalized experience for customers. And yet, they are being held back by legacy technologies that are expensive, outmoded, and cannot address new delivery channels or modern privacy and security regulations. Learn how to modernize your customer communications.
The document discusses how big data is revolutionizing manufacturing. It defines big data and describes how manufacturers can benefit from big data analysis. Big data can help manufacturers improve processes, ensure product quality and safety, eliminate waste, and collaborate better. The document also provides examples of how big data is used in manufacturing for applications like optimizing production processes, custom product design, quality assurance, and managing supply chain risks. It discusses common reasons why companies fail with big data initiatives and outlines the future road ahead, including implementing Hadoop storage platforms, taking a lean approach, and leveraging the Internet of Things.
Digital Enterprise Architecture: Four Elements Critical to Solution Envisioning (Cognizant)
For the digital enterprise, architecture of all varieties must evolve strategically in step with technological capabilities and business imperatives. Such a multidimensional approach includes automation, AI, analytics, big data management and digitization as a holistic phenomenon.
Implementing an Efficient Data Governance and Security Strategy with ... (Denodo)
Watch full webinar here: https://bit.ly/3lSwLyU
In an era of exploding information spread across different sources, data governance is a key component for guaranteeing the availability, usability, integrity, and security of information. The set of processes, roles, and policies it defines enables organizations to reach their goals while ensuring the efficient use of their data.
Data virtualization is one of the strategic tools for implementing and optimizing data governance. The technology lets companies create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of the data's format or location. It brings multiple data sources together, makes them accessible through a single layer, and provides traceability capabilities to monitor changes to the data.
Join this webinar to learn:
- How to accelerate the integration of data from fragmented internal and external sources and obtain an integral view of the information.
- How to roll out a single, protected data-access layer across the whole company.
- How data virtualization provides the foundations for complying with current data protection regulations through data auditing, cataloging, and security.
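The idea of one governed access layer over fragmented sources can be sketched in a few lines of code. This is a conceptual illustration under stated assumptions: the source classes, the `VirtualLayer` interface, and the audit log are invented for the example and are not Denodo's API.

```python
# A toy "virtual layer": many heterogeneous sources are exposed behind one
# query interface, and every access is recorded for traceability.
# NOTE: all class and method names here are illustrative assumptions.
import datetime

class CsvSource:
    def __init__(self, rows): self.rows = rows
    def query(self, predicate): return [r for r in self.rows if predicate(r)]

class ApiSource:
    def __init__(self, rows): self.rows = rows
    def query(self, predicate): return [r for r in self.rows if predicate(r)]

class VirtualLayer:
    """Single access layer: federates queries across sources and keeps an
    audit trail of who queried what, and when."""
    def __init__(self, sources):
        self.sources = sources
        self.audit_log = []

    def query(self, user, predicate):
        self.audit_log.append((datetime.datetime.now(datetime.timezone.utc), user))
        results = []
        for source in self.sources:  # fan the query out to every source
            results.extend(source.query(predicate))
        return results

layer = VirtualLayer([
    CsvSource([{"customer": "A", "region": "EU"}]),
    ApiSource([{"customer": "B", "region": "EU"},
               {"customer": "C", "region": "US"}]),
])
eu = layer.query("analyst", lambda r: r["region"] == "EU")
print([r["customer"] for r in eu])  # -> ['A', 'B']
```

The consumer never sees where each row physically lives, while the audit log gives governance teams the traceability the webinar description refers to.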
The document discusses a value model created by Intel's Digital Health Group to help healthcare organizations discuss and measure the benefits of healthcare IT (HIT) investments. The model identifies seven value dials - quality of care, patient safety, patient access, physician/staff productivity, physician/staff satisfaction, revenue enhancement, and cost optimization - that HIT investments can impact. The model provides a framework for healthcare organizations to determine which objectives they want to achieve through HIT and how to measure progress towards those objectives using relevant key performance indicators. Using this model, organizations can better evaluate HIT investments and initiatives.
Understanding Content: Machine Learning & the Modern Insurer (Nuxeo)
In today’s digital world, insurers are drowning in content. Accident photos, policy applications, statements, customer emails, and SMS messages: all of this is content that underpins your business and your customer experience. Yet for most insurers, content remains a massive business challenge: difficult to find, hard to process, impossible to analyze, and very expensive.
Discover how AI and machine learning can be used to transform your content into a business advantage.
We'll share real examples of how other P&C and Health insurers are using these technologies to automate virtual claims, intelligently process policy applications, detect fraud, and even gain new insight into accident damage.
You will learn:
- How AI uniquely addresses your content challenges
- Why custom ML models deliver more value
- What “content lakes” are, and what value they add
- How to take your first steps with AI
This document contains confidential information about Bitrock S.r.l.'s services and cannot be copied or distributed without permission. Bitrock provides solutions for continuous intelligence in manufacturing through IoT data analytics. They use stream processing and artificial intelligence to provide real-time insights from machinery data. Their approach involves connecting devices, collecting and analyzing streaming data, designing machine learning models, applying them to processes, and scaling the system across operations.
Stéphane Mouton from CETIC presented on data analysis in the steel industry and challenges of Industry 4.0. CETIC has participated in research projects with steel companies to explore data analysis opportunities. While steel production data volumes are not huge, Industry 4.0 is creating new data sources from sensors. Managing semantic metadata, performing distributed analysis close to data sources, and addressing data variety are key challenges. CETIC is exploring frameworks for processing data in heterogeneous environments and automatic support for moving preprocessing between IoT and cloud environments.
Enterprise Architecture - An Introduction (Daljit Banger)
These slides are from my session at "An Evening of Enterprise Architecture Awareness," held at the University of Sussex, hosted by the BCS local chapter and facilitated by the BCS EA Specialist Group.
This presentation was given by Professor Christine Legner (HEC Lausanne) at the Swiss Day on November 8, 2017, in Lausanne, Switzerland. It addresses the need for organisations to think about data and its management in new ways as many corporations engage in the digital and data-driven transformation of their business. It concludes with three recommendations: 1) assess data's business value and impact, 2) measure and improve data quality, and 3) democratize data and support data citizenship.
This document summarizes a presentation about how SSM Health established an effective data governance program through internal crowdsourcing. It describes how top-down or bottom-up approaches alone can fail, but integrating the two can succeed. SSM Health engaged a wide range of stakeholders to provide input and establish consensus on definitions, policies and metrics. They created an "Information Portfolio" knowledge base where subject matter experts could collaboratively define and standardize key terms and measures. This approach helped overcome challenges like inconsistent definitions and aligned data governance with strategic goals.
From Fragmentation to Integration: Data for the Changing Working Life (Pauli Forma)
This document discusses the development of a new data platform for work life in Finland. It notes that digitalization is creating both new opportunities and challenges from increased data and new ways to utilize data. The goal of the platform is to integrate fragmented work life data through a single dashboard that covers new topics and utilizes big data. It will collect near real-time work life data from various sources for customers to access. The platform aims to facilitate collaboration between different stakeholders in occupational health to maximize benefits and ensure technology supports rather than shapes work life in unwanted ways.
Spare Part Manufacturing Company is looking for a big data analytics solution that will pull data from the server's Datalog to diagnose various issues.
Lean Production Meets Big Data: A Next Generation Use Case (Datameer)
The document discusses how big data and analytics can help optimize business processes using lean principles. It provides an overview of lean production concepts like identifying and eliminating waste. Big data is presented as a new approach to gain insights from process data that can help pinpoint improvement opportunities. The speaker demonstrates how Datameer's software allows users to easily analyze data from multiple sources and measure key performance indicators to drive continuous process improvements.
How Insurers Fueled Transformation During a Pandemic (Nuxeo)
For many insurers, the past year has accelerated strategic investments to manage remote workforces, support virtual claims handling, and face off with FinTech upstarts.
In this webinar, we look at how leading insurers not only addressed the immediate challenges caused by global lockdowns but also found new efficiencies along the way. Get insights into some of the emerging technologies that are driving innovation in insurance, including the Cloud, artificial intelligence, and low-code. We also explore how these technologies reduce claims leakage while improving claims accuracy, employee productivity, and customer satisfaction.
Accelerate Digital Transformation with Data Virtualization in Banking, Financ... (Denodo)
Watch full webinar here: https://bit.ly/38uCCUB
Banking, Financial Services and Insurance (BFSI) organizations are globally accelerating their digital journey, making rapid strides with their digitization efforts, and adding key capabilities to adapt and innovate in the new normal.
Many companies find digital transformation challenging as they rely on established systems that are often not only poorly integrated, but also highly resistant to modernization without downtime. Hear how the BFSI industry is leveraging data virtualization that facilitates digital transformation via a modern data integration / data delivery approach to gain greater agility, flexibility, and efficiency.
In this joint live webinar session from Denodo and Wipro, you will learn:
- Industry key trends and challenges driving the digital transformation mandate and platform modernization initiatives
- Key concepts of Data Virtualization, and how it can enable BFSI customers to develop critical capabilities for real-time / near real-time data integration
- Success Stories on organizations who already use data virtualization to differentiate themselves from the competition
- Wipro’s role in helping enterprises define the business case, end-to-end services and operating model for the successful data virtualization implementations
Schedule a Discovery Session to learn more about Wipro and Denodo joint solutions for Banking, Financial Services, and Insurance.
Big Data Public-Private Forum: General Presentation (BIG Project)
The BIG project aims to establish a public-private forum on big data in Europe. It will bring together stakeholders from industry and research to develop a shared vision for big data in Europe over the next 5-10 years. The project will identify existing big data technologies, understand how big data can be applied across sectors, and develop roadmaps for specific sectors. It seeks to increase European competitiveness and position big data research priorities in the EU framework program Horizon 2020. The project is coordinated across four work packages covering management, strategy and operations, dissemination, and establishing the public-private forum.
Introduction to Data Governance
A seminar hosted by Embarcadero Technologies, where Christopher Bradley presented a session on data governance. Topics covered:
- Drivers for Data Governance & Benefits
- Data Governance Framework
- Organization & Structures
- Roles & Responsibilities
- Policies & Processes
- Programme & Implementation
- Reporting & Assurance
This document summarizes key findings from the 2018 22nd Annual 3PL Study. The study examines the current state of the third-party logistics (3PL) market through an email survey of shippers and 3PL/4PL providers from major industries globally. Special topics in the 2018 study include blockchain for supply chain, automation/digitization in transportation, and risk/resilience in shipper-3PL relationships. The study also explores contemporary issues like the logistics talent revolution.
Next Generation Customer Communications ManagementNuxeo
Each year, banks and insurers generate billions of customer communications with one simple goal: a more personalized experience for customers. And yet, they are being held back by legacy technologies that are expensive, outmoded, and cannot address new delivery channels or modern privacy and security regulations. Learn how to modernize your customer communications.
The document discusses how big data is revolutionizing manufacturing. It defines big data and describes how manufacturers can benefit from big data analysis. Big data can help manufacturers improve processes, ensure product quality and safety, eliminate waste, and collaborate better. The document also provides examples of how big data is used in manufacturing for applications like optimizing production processes, custom product design, quality assurance, and managing supply chain risks. It discusses common reasons why companies fail with big data initiatives and outlines the future road ahead, including implementing Hadoop storage platforms, taking a lean approach, and leveraging the Internet of Things.
Digital Enterprise Architecture: Four Elements Critical to Solution EnvisioningCognizant
For the digital enterprise, architecture of all varieties must evolve strategically in step with technological capabilities and business imperatives. Such a multidimensional approach includes automation, AI, analytics, big data management and digitization as a holistic phenomenon.
Implementar una estrategia eficiente de gobierno y seguridad del dato con la ...Denodo
Watch full webinar here: https://bit.ly/3lSwLyU
En la era de la explosión de la información repartida en distintas fuentes, el gobierno de datos es un componente clave para garantizar la disponibilidad, usabilidad, integridad y seguridad de la información. Asimismo, el conjunto de procesos, roles y políticas que define permite que las organizaciones alcancen sus objetivos asegurando el uso eficiente de sus datos.
La virtualización de datos forma parte de las herramientas estratégica para implementar y optimizar el gobierno de datos. Esta tecnología permite a las empresas crear una visión 360º de sus datos y establecer controles de seguridad y políticas de acceso sobre toda la infraestructura, independientemente del formato o de su ubicación. De ese modo, reúne múltiples fuentes de datos, las hace accesibles desde una sola capa y proporciona capacidades de trazabilidad para supervisar los cambios en los datos.
Le invitamos a participar en este webinar para aprender:
- Cómo acelerar la integración de datos provenientes de fuentes de datos fragmentados en los sistemas internos y externos y obtener una vista integral de la información.
- Cómo activar en toda la empresa una sola capa de acceso a los datos con medidas de protección.
- Cómo la virtualización de datos proporciona los pilares para cumplir con las normativas actuales de protección de datos mediante auditoría, catálogo y seguridad de datos.
The document discusses a value model created by Intel's Digital Health Group to help healthcare organizations discuss and measure the benefits of healthcare IT (HIT) investments. The model identifies seven value dials - quality of care, patient safety, patient access, physician/staff productivity, physician/staff satisfaction, revenue enhancement, and cost optimization - that HIT investments can impact. The model provides a framework for healthcare organizations to determine which objectives they want to achieve through HIT and how to measure progress towards those objectives using relevant key performance indicators. Using this model, organizations can better evaluate HIT investments and initiatives.
Understanding Content:Machine Learning & the Modern InsurerNuxeo
In today’s digital world, insurers are literally drowning in content. Accident photos, policy applications, statements, customer emails and SMS messages -- these are all content that underpins your business and your customer experience. Yet for most insurers, content remains a massive business challenge: difficult to find, hard to process, impossible to analyze, and oh so expensive.
Discover how AI and machine learning can be used to transform your content into a business advantage.
We'll share real examples of how other P&C and Health insurers are using these technologies to automate virtual claims, intelligently process policy applications, detect fraud, and even gain new insight into accident damage.
You will learn:
- How AI uniquely addresses your content challenges
- Why custom ML models deliver more value
- What “content lakes” are, and what value they add
- How to take your first steps with AI
This document contains confidential information about Bitrock S.r.l.'s services and cannot be copied or distributed without permission. Bitrock provides solutions for continuous intelligence in manufacturing through IoT data analytics. They use stream processing and artificial intelligence to provide real-time insights from machinery data. Their approach involves connecting devices, collecting and analyzing streaming data, designing machine learning models, applying them to processes, and scaling the system across operations.
Stéphane Mouton from CETIC presented on data analysis in the steel industry and challenges of Industry 4.0. CETIC has participated in research projects with steel companies to explore data analysis opportunities. While steel production data volumes are not huge, Industry 4.0 is creating new data sources from sensors. Managing semantic metadata, performing distributed analysis close to data sources, and addressing data variety are key challenges. CETIC is exploring frameworks for processing data in heterogeneous environments and automatic support for moving preprocessing between IoT and cloud environments.
Enterprise Architecture - An Introduction Daljit Banger
The Slides are from my session at "An Evening of Enterprise Architecture Awareness" held at theUniversity of Sussex Hosted by the BCS Local Chapter and facilitated by the BCS EA Specialist Group.
This presentation was held by Professor Christine Legner (HEC Lausanne) at the Swiss Day on November 8, 2017, in Lausanne, Switzerland. It addresses the need for organisations to think about data and its management in new ways, as many corporations engage in the digital and data-driven transformation of their business. It concludes with three recommendations: 1) assess data's business value and impact, 2) measure and improve data quality, and 3) democratize data and support data citizenship.
This document summarizes a presentation about how SSM Health established an effective data governance program through internal crowdsourcing. It describes how top-down or bottom-up approaches alone can fail, but integrating the two can succeed. SSM Health engaged a wide range of stakeholders to provide input and establish consensus on definitions, policies and metrics. They created an "Information Portfolio" knowledge base where subject matter experts could collaboratively define and standardize key terms and measures. This approach helped overcome challenges like inconsistent definitions and aligned data governance with strategic goals.
From Fragmentation to Integration: Data for the Changing Working LifePauli Forma
This document discusses the development of a new data platform for work life in Finland. It notes that digitalization is creating both new opportunities and challenges from increased data and new ways to utilize data. The goal of the platform is to integrate fragmented work life data through a single dashboard that covers new topics and utilizes big data. It will collect near real-time work life data from various sources for customers to access. The platform aims to facilitate collaboration between different stakeholders in occupational health to maximize benefits and ensure technology supports rather than shapes work life in unwanted ways.
Spare Part Manufacturing Company is looking for a big data analytics solution that pulls data from the server's data log to diagnose various issues.
Lean Production Meets Big Data: A Next Generation Use Case by Datameer
The document discusses how big data and analytics can help optimize business processes using lean principles. It provides an overview of lean production concepts like identifying and eliminating waste. Big data is presented as a new approach to gain insights from process data that can help pinpoint improvement opportunities. The speaker demonstrates how Datameer's software allows users to easily analyze data from multiple sources and measure key performance indicators to drive continuous process improvements.
How Insurers Fueled Transformation During a Pandemic by Nuxeo
For many insurers, the past year has accelerated strategic investments to manage remote workforces, support virtual claims handling, and face off with FinTech upstarts.
In this webinar, we look at how leading insurers not only addressed the immediate challenges caused by global lockdowns but also found new efficiencies along the way. Get insights into some of the emerging technologies that are driving innovation in insurance, including the Cloud, artificial intelligence, and low-code. We also explore how these technologies reduce claims leakage while improving claims accuracy, employee productivity, and customer satisfaction.
Accelerate Digital Transformation with Data Virtualization in Banking, Financ... by Denodo
Watch full webinar here: https://bit.ly/38uCCUB
Banking, Financial Services and Insurance (BFSI) organizations are globally accelerating their digital journey, making rapid strides with their digitization efforts, and adding key capabilities to adapt and innovate in the new normal.
Many companies find digital transformation challenging as they rely on established systems that are often not only poorly integrated, but also highly resistant to modernization without downtime. Hear how the BFSI industry is leveraging data virtualization that facilitates digital transformation via a modern data integration / data delivery approach to gain greater agility, flexibility, and efficiency.
In this joint live webinar session from Denodo and Wipro, you will learn:
- Industry key trends and challenges driving the digital transformation mandate and platform modernization initiatives
- Key concepts of Data Virtualization, and how it can enable BFSI customers to develop critical capabilities for real-time / near real-time data integration
- Success stories from organizations that already use data virtualization to differentiate themselves from the competition
- Wipro's role in helping enterprises define the business case, end-to-end services, and operating model for successful data virtualization implementations
Schedule a Discovery Session to learn more about Wipro and Denodo joint solutions for Banking, Financial Services, and Insurance.
Big Data Public-Private Forum: General Presentation by BIG Project
The BIG project aims to establish a public-private forum on big data in Europe. It will bring together stakeholders from industry and research to develop a shared vision for big data in Europe over the next 5-10 years. The project will identify existing big data technologies, understand how big data can be applied across sectors, and develop roadmaps for specific sectors. It seeks to increase European competitiveness and position big data research priorities in the EU framework program Horizon 2020. The project is coordinated across four work packages covering management, strategy and operations, dissemination, and establishing the public-private forum.
Introduction to Data Governance
Seminar hosted by Embarcadero Technologies, where Christopher Bradley presented a session on Data Governance.
Drivers for Data Governance & Benefits
Data Governance Framework
Organization & Structures
Roles & responsibilities
Policies & Processes
Programme & Implementation
Reporting & Assurance
Data quality is fundamental at every stage of the information lifecycle, from data capture to decision-making based on the knowledge generated from that data. Data governance helps ensure data quality throughout this cycle.
The document discusses product information management (PIM) for HP Printing and Personal Systems. It outlines the challenges of managing vast amounts of product data across departments and systems. It then describes how a PIM solution could address these challenges by providing a single source of truth for product information through capabilities like data integration, governance and a centralized repository. The paper also provides details on how HP could implement a PIM architecture using a transactional hub model to manage master product data.
The document presents 10 keys to the success of a Master Data Management (MDM) project. These include: 1) clearly defining the MDM project, 2) distinguishing MDM from data warehousing, 3) establishing data governance, 4) ensuring access to all data, 5) discovering metadata, 6) cleansing data, 7) creating unique master records, 8) delivering correct master data, 9) managing organizational change, and 10) calculating the return on investment.
This document addresses the importance of data quality for companies. Many companies ignore problems with their data, such as incorrect customer personal data, wrong addresses, and erroneous telephone numbers. Data quality offers advantages such as accurate information for decision-making and cost savings. However, companies generally fail to eliminate poor data quality on their own because of myths such as it being too expensive or too complex. Choosing a local data quality specialist is recommended.
Get a Unified View of Your Organization's Data (MDM) by PowerData
The document discusses data governance and the need for a master data management (MDM) solution. It proposes an MDM platform that can access, discover, cleanse, map, and deliver master data from any source to meet business requirements such as optimizing channels, increasing revenue, and improving adaptability. It also discusses how Informatica can meet fundamental MDM requirements such as data integration and data quality to deliver relevant master data…
The document discusses Express Scripts winning the 2012 Data Governance Best Practice Award. It provides details on the award, including that it was the third annual award and there were 10 submissions that were evaluated by a panel of specialists. Express Scripts was selected as the winner for its formalized data governance program that included data quality initiatives, a data stewardship program, and organizational structure to support data governance across the large company. The program has resulted in estimated cost avoidance of over $20 million in 2011 from improved data quality and governance.
Master Data Management - MDM - Steps to Implement MDM by Jose Pla
What is MDM?
What should be considered when implementing MDM?
How to implement MDM?
How MDM relates to Business Intelligence (BI)
Case studies
Data Governance in a Federated Organization - A Case Study of World Vision In... by DATAVERSITY
This document provides an overview of World Vision's development of a data governance program within its federated organizational structure. World Vision is a global Christian humanitarian organization made up of multiple independent national entities. It established a Data Governance Office in 2008 to help govern data across these entities. The office developed a strategy and roadmap but then faced budget cuts during the 2008 financial crisis. It focused on governing child sponsorship data by bringing records together and complying with EU privacy laws. This incremental approach helped the program survive and establish governance working groups and a council. Lessons included starting small and building value, having a clear plan, and gradually expanding the program's scope through proven results.
A presentation on Master Data Management (Administración de Datos Maestros) given by Adriana Rodriguez and Luis Fernando Ortiz for the Information Modeling and Management course in the Specialization in Informatics Projects at Universidad Distrital Francisco José de Caldas. Bogotá, Colombia, November 2010.
How to Implement Data Governance Best Practice by DATAVERSITY
This document provides an overview of a webinar on implementing data governance best practices. It discusses defining data governance best practices and assessing an organization's current practices against those best practices. Examples of best practices from different industries are provided. The document emphasizes communicating best practices in a non-threatening way and building best practices into daily operations. Key aspects covered include criteria for determining best practices, messages to convey to management, and best practices related to creating a best practices document.
This document discusses late binding in data warehousing and its importance for analytic agility. Late binding means delaying the binding of data to rules and vocabularies for as long as possible. This allows data to be used flexibly for different analyses without being rigidly structured early on. It also discusses the progression of analytic sophistication in healthcare and how late binding is needed to support more advanced predictive and prescriptive analytics. Maintaining a record of changes to data bindings over time helps enable retracing of analytic steps. While early binding may be suitable when rules/vocabularies are stable, late binding is generally preferable to maximize flexibility and adaptability for analytics.
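The late-binding idea summarized above can be illustrated with a minimal Python sketch. This is a hypothetical example, not code from the presentation: the field names, diagnosis codes, and vocabulary versions are invented purely to show raw data being bound to a vocabulary at analysis time rather than at load time, with a log of which binding was used.

```python
# Minimal sketch of "late binding": raw records are stored unmodified,
# and vocabulary mappings are applied only at analysis time, so the same
# raw data can be re-bound under a newer vocabulary version later.
from datetime import date

# Raw data is loaded as-is; no vocabulary is baked in at load time.
raw_records = [
    {"patient_id": 1, "dx_code": "E11"},
    {"patient_id": 2, "dx_code": "I10"},
]

# Versioned vocabulary bindings, kept as data so changes stay traceable.
vocabularies = {
    "v1": {"E11": "Type 2 diabetes", "I10": "Hypertension"},
    "v2": {"E11": "Type 2 diabetes mellitus", "I10": "Essential hypertension"},
}

binding_log = []  # record of which binding was used, and when

def bind(records, version):
    """Apply a vocabulary to raw records at query time (late binding)."""
    binding_log.append({"version": version, "bound_on": date.today().isoformat()})
    vocab = vocabularies[version]
    return [dict(r, diagnosis=vocab[r["dx_code"]]) for r in records]

# The same raw data can be analyzed under either vocabulary version,
# and the binding log lets an analyst retrace which mapping produced
# which result.
report_v1 = bind(raw_records, "v1")
report_v2 = bind(raw_records, "v2")
print(report_v1[0]["diagnosis"])  # Type 2 diabetes
print(report_v2[0]["diagnosis"])  # Type 2 diabetes mellitus
```

With early binding, the mapping would instead be applied once at load time and the raw codes discarded, which is simpler but forces a reload whenever the vocabulary changes.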
Results of a global survey conducted by Allegro 234 to choose the coolest brands and to understand the gap between those brands and the performance of our own organizations' brands.
This document provides instructions for a school project on earthquakes. Students must organize the information into 13 sections and include images and maps in a presentation. Online resources are provided on the structure of the Earth, tectonic plates, causes of earthquakes, earthquake-resistant architecture, and more. The work will be graded on meeting the requirements and on the quality of the presentation.
Universidad Técnica Latinoamericana offers several engineering and business administration degree programs at three campuses in Santa Tecla. UTLA focuses on practical education through well-equipped laboratories and internships. The university offers flexible weekend schedules.
Corporate Data Quality Management Research and Services Overview by Boris Otto
This presentation provides an overview of the research and services portfolio of the Business Engineering Institute (BEI) St. Gallen in the field of corporate data quality management (CDQM). CDQM comprises topics such as data governance, data quality measurement, master data management, and data architecture management. At the core of the research and service portfolio is the Competence Center Corporate Data Quality (CC CDQ), a consortium research project at the Institute of Information Management at the University of St. Gallen (IWI-HSG). Partner companies come from various industry and service sectors.
Enterprise Master Data Architecture: Design Decisions and Options by Boris Otto
The enterprise-wide management of master data is a prerequisite for companies to meet strategic business requirements such as compliance with regulatory requirements, integrated customer management, and global business process integration. Among other things, this demands systematic design of the enterprise master data architecture. The current state of the art, however, does not provide sufficient guidance for practitioners, as it specifies neither the concrete design decisions they have to make nor the design options from which they can choose with regard to the master data architecture. This paper aims to address this gap. It reports the findings of three case studies and uses morphological analysis to structure design decisions and options for the management of an enterprise master data architecture.
Evolution of Data Governance Excellence by patriziapesce
1) The document discusses data governance design options for large enterprises, including a line organization, staff organization per business unit, shared service center, and externalization models.
2) It provides two examples of how companies have implemented data governance: a high tech company that used a central function and a chemical company that used a shared service center for governance and operational responsibility.
3) Key principles for an effective governance design are discussed, including being global, shared, governing, service-oriented, managed, and empowered. The evolution from a shared service to outsourced data management processes is also covered.
EDF2013: Talk of European Data Innovator Award Winner: Michael Gorriz, CIO of... by European Data Forum
1) The presentation discusses the structural changes facing the "old economy" as products and processes become increasingly digitized, requiring corporate IT to embrace new technologies like linked data and open data.
2) It explains the evolution of the internet towards linked data, from early networks to today's large amounts of data where integration and standards are important.
3) Linked data activities at Daimler are highlighted, including proof of concepts around factual search, content processes, and improving Mercedes-Benz's website by extracting vehicle data and attributes.
The document discusses trends in big data and analytics. It notes that continuous transformation is the new normal due to converging technology disruptors that create opportunities but also threaten business models. IBM's response is focused on its four key plays of cloud, big data, social and mobile. Harnessing all data requires shifting thinking and evolving approaches to leverage all information from all perspectives for all decisions across all departments. Initial big data efforts often focus on gaining insights from existing internal data sources. The document outlines five patterns resulting from high value big data initiatives such as exploring all big data to improve business knowledge or achieving a complete unified view of the customers.
This talk will unveil QIAGEN’s Biomedical Knowledge Base products, elucidating their structure and schema design optimized for complex data exploration and sophisticated question-answering in the biomedical sector.
Enterprise Data Management: Managing your Business’s Entire Data Lifecycle by NIXUnited
Speaker: Eugene Rudenko, AI Solutions Consultant at NIX United
https://bit.ly/3OfVz1h
- Learn about what data management is and why this process is a must-have for enterprises in 2022
- Learn how to assess the maturity level of data management and digital transformation in your organization
- Learn advice on how to decide between ready-made products and custom solutions
- Learn about the advantages and disadvantages in comparison of cloud vs. on-premise vs. hybrid solutions
- Learn about data management advantages by the example of our use case AWS-based BI Platform for Data Visualization and Marketing Insights
- Learn about necessary conditions to ensure secure data storage and data compliance standards
- Learn about the importance of high security for the workspace and inner channels of communication
Enterprise Data Management: Managing your Business’s Entire Data Lifecycle by ErinDempsey17
AIIM FL Chapter webinar featuring Eugene Rudenko, NIX United
The amount of data businesses generate and use in their operational activities grows exponentially every year. Ideally, all that data should be stored, organized, and processed at a reasonable cost. Enterprise data management (EDM) is therefore not a buzzword but a necessary component for any modern business that wants to turn data into an efficient tool and gain an advantage over competitors. Competent data management is all about establishing a process that extracts value from data, mitigates risk, and contributes to data-driven decisions. In addition, well-established EDM is secure and increases the quality, integrity, and trustworthiness of the data used for business operations and reporting.
If you have heard about data management but always wanted to understand what it is in a nutshell, what its benefits are, and, most importantly, how to organize this process in your company or level up your existing data management process, this slide deck is worth reviewing.
You will see the full potential of data management solutions, get meaningful advice from a seasoned expert about accelerating a business's digital transformation and frictionless building of EDM for your company proven with case studies.
You will learn about:
• What data management is and why this process is a must-have for enterprises in 2022
• How to assess the maturity level of data management and digital transformation in your organization
• Advice on how to decide between ready-made products and custom solutions
• Advantages and disadvantages in comparison of cloud vs. on-premise vs. hybrid solutions
• Data management advantages by the example of our use case AWS-based BI Platform for Data Visualization and Marketing Insights
• Necessary conditions to ensure secure data storage and data compliance standards
• The importance of high security for workspace and inner channels of communication
The document summarizes a presentation about managing information projects in the current environment. It outlines three key trends affecting information management: 1) the increasing amount of information being created, 2) advances in technology allowing greater storage and access, and 3) rising user expectations as a result. It also discusses challenges for project managers, such as performing pre-assessments to understand knowledge gaps, localizing information for different languages/cultures, and protecting intellectual property through digital rights management.
What Do I Know About My Customers - Human Inference - Atos Consulting by DataValueTalk
Who is my customer, and how does he behave? Where is he? Is my customer really who he says he is? Correct customer knowledge and up-to-date data of good quality are essential to companies, especially when the economic outlook is not very positive.
Enabling a Bimodal IT Framework for Advanced Analytics with Data Virtualization by Denodo
Watch: https://bit.ly/2FLc5I2
Maintaining a well-managed and curated data warehouse while keeping up with the demands of a very sophisticated consumer group can be a challenge. The new user wants access to data; they want to experiment, fail fast, and, if they find usable insights or algorithms, have them productionized. This puts pressure on an IT organization and pushes it closer to a bimodal operation, where the regular IT processes that are highly curated, well defined, and managed contrast sharply with the demands of the more sophisticated user.
In the recently published TDWI Best Practices Report, "Data Management for Advanced Analytics," Philip Russom discusses some of these newer requirements of the more sophisticated user at some length. How can IT support traditional demands around BI and reporting while keeping the business's growing demand for data and advanced analytics in mind?
Attend and learn:
- How data virtualization enables this bimodal approach to data management
- How data virtualization enables compelling use cases for data management and advanced analytics
- How we can achieve this important balance with process and technology
The document introduces information logistics and discusses Ricoh's approach. It provides:
1) An explanation of information logistics, focusing on optimizing information availability, retention time, and delivery to the right user.
2) An analysis framework that assesses availability and efficiency losses impacting knowledge worker productivity. Case studies show how analyzing processes can improve information flow.
3) Details on Ricoh's current propositions that implement information logistics principles through standardized processes and tools to efficiently deliver solutions.
Industrial Data Space - Why we need a European Initiative on Data Sovereignty by Thorsten Huelsmann
IDS stands for safer data exchange between companies, in which the producer of the data remains its owner and maintains sovereignty over how the data is used.
The IDS Association aims to define the conditions and governance for a reference architecture and interfaces, with the goal of international standardization.
Corporate Data Quality: Research and Services Overview by Boris Otto
The document provides an overview of corporate data quality research and services from the Competence Center Corporate Data Quality (CC CDQ) at the University of St. Gallen. It discusses how data quality is a success factor for business and introduces the CC CDQ's research focus areas, consortium of partner companies, and services which include assessments, strategy development, and knowledge sharing. The CC CDQ team leverages expertise in research and consulting to help organizations improve data quality management.
The document discusses a pilot data platform project at Vrije Universiteit Brussel. The goals of the pilot are to better support policy decisions, operational functioning, and business prospects through increased access to institutional data. Specifically, the pilot aims to gain insights into academic networks and partnerships and support data-driven internationalization strategies. The pilot will involve building a data warehouse from Pure data to enable more structured data provision, reusable dashboards, and increased data-driven decision making. It will utilize SQL Server, Power BI Desktop, and Power BI Service to generate reports and insights from the data.
The top trends changing the landscape of Information Management by Velrada
The role of information and data in the private sector, and how employees and users interact with that information, is changing rapidly.
With endless buzzwords and hot topics, and a ream of new technologies and upgrades, it can be difficult for organisations to know where to begin or how it translates into actionable insight.
This document provides a brief biography of Dr. Basuki Rahmad and outlines his presentation on data governance maturity models. It includes his educational and professional background, areas of research focus, academic and professional activities, and professional associations. The presentation outline covers an overview of data governance, existing data governance maturity models, and the CMM data governance maturity model developed by Rahmad. It also identifies potential areas for further research related to data governance mechanisms, scope, and implementation.
ING Lease UK - The relationship between Risk & Compliance and Data Quality - ... by DataValueTalk
ING Lease is a major European leasing company operating in 15 countries. It employs various committees and processes to manage data quality. Proper data ownership and consistent definitions are important to ensure accurate risk measurement and regulatory compliance. Incorrect, incomplete, or inconsistent data can negatively impact areas like credit decisions, capital requirements, and financial reporting.
This document provides an overview of Inawisdom and their intelligent document processing (IDP) capabilities. Key points include:
- Inawisdom is an AWS partner focused on data, AI, and machine learning. They have over 180 AWS certifications and help customers achieve their ML goals.
- Their IDP platform uses AWS services like Textract, Comprehend, and SageMaker to extract key data from documents through tasks like classification, extraction, and formatting.
- Case studies demonstrate how they have helped customers in industries like banking and insurance automate processes like underwriting and invoice processing through IDP to improve efficiency, accuracy, and insights.
Big Data Analytics in light of Financial Industry by Capgemini
Big data and analytics have the potential to transform economies and competition by delivering new productivity growth. Effective use of big data can increase operating margins over 60% for retailers and save $300 billion in US healthcare and $250 billion in European public sector. Companies that improve decision making through big data have seen a 26% performance improvement over 3 years on average. Emerging technologies like self-driving cars will rely heavily on analyzing vast amounts of real-time sensor data.
Similar to Master Data Governance Best Practices
This document discusses the evolution of data spaces from closed ecosystems to open ecosystems to federations of ecosystems. It defines key concepts of data spaces including their technological, business, and legal aspects. The document outlines an example data space in the mobility domain and describes the fundamentals of data spaces including roles, interactions, and activities. It analyzes how characteristics such as interoperability, sovereignty, and trust/security change as data spaces evolve from closed to open to federations. Finally, it poses questions about who will take on the federator role to coordinate ecosystems and what business models and regulatory implications this role may have.
Shared Digital Twins: Collaboration in Ecosystems by Boris Otto
This presentation introduces the concept of shared digital twins from a business perspective and outlines recent technological developments for shared digital twin management.
Germany on the Way to the Data Economy by Boris Otto
The talk (in German) takes up current threads of discussion among business, academia, and politics and addresses, among other things, the business, macroeconomic, information-technology, and ethical dimensions of the data economy.
Business with Data? Germany on the Way to the Smart Data Economy by Boris Otto
This presentation (in German) given at the "Tage der digitalen Technologien" on May 15, 2019, in Berlin addresses data ecosystems as an innovative institutional format for creating value out of shared data. Furthermore, the talk points to selected challenges in setting up data ecosystems.
International Data Spaces: Data Sovereignty and Interoperability for Business... by Boris Otto
This presentation was held in a workshop session on IoT Business Models and Data Interoperability at the Max Planck Institute for Innovation and Competition in Munich on 8 October 2018. The presentation introduces the concept of business ecosystems and the role of data within them, then outlines the state of the art in terms of interoperability and sovereignty, and finally sketches the IDS contribution.
This presentation was part of the IDS Webinar on Data Governance. It gives a brief overview of the history on Data Governance, describes how governing data has to be further developed in the era of business and data ecosystems, and outlines the contribution of the International Data Spaces Association on the topic.
Smart Data Engineering: A Success Factor for Digital Transformation by Boris Otto
This presentation (in German) was given at the Strategieforum IoT at Schloss Hohenkammer on May 30, 2018. It introduces the challenges of data management in the Internet of Things and explains principles of Smart Data Engineering.
IDS: Update on Reference Architecture and Ecosystem Design by Boris Otto
This presentation motivates the Industrial Data Space and gives an update on the IDS Reference Architecture Model as well as the related ecosystem. It sets data in the context of business model innovation and points out how the IDS Reference Architecture relates to alternative data architecture styles such as data lakes and blockchain technology, for example. The presentation was given at the IDSA Summit on March 22, 2018.
Data Sovereignty in Production and Logistics Networks by Boris Otto
This talk (in German) motivates data sovereignty in production and logistics networks. Data sovereignty is the ability to exercise self-determination over data as an economic asset, even when exchanging data in business networks. The talk introduces the architecture of the Industrial Data Space, which forms a virtual data space for sovereign data exchange, and closes with application examples and a discussion of its contribution to research and practice.
Digital Business Engineering at Fraunhofer ISST by Boris Otto
This presentation (in German) gives an overview of how Fraunhofer ISST supports digital transformation projects in various industries. It motivates Digital Business Engineering as a methodological framework and showcases typical applications. The presentation was given at the Fraunhofer ISST 25th anniversary event at Zeche Zollern in Dortmund.
Using the automotive industry as an example, the talk (in German) introduces the key developments in the digitalization of industrial companies and highlights the special role of data and of effective data management. It concludes with recommendations for managing the digital transformation.
Data Sovereignty - Call for an International Effort by Boris Otto
This presentation was given at the Digitising Manufacturing in the G20 Conference on March 16, 2017, in Berlin, in the context of the workshop "Data Sovereignty in Global Value Networks".
This presentation was held at the 2nd Internet of Manufacturing Conference on February 7, 2017, in Munich, Germany. It addresses the need of a new kind of data management to cope with the requirements digital scenarios pose on the industrial enterprise. Motivated by examples, the talk outlines design principles for smart data management and concludes with two leading examples, namely the Industrial Data Space initiative and the Corporate Data League.
Industrial Data Space: A Reference Architecture Model for Digitalization by Boris Otto
This presentation (in German), given at the VDI Industrie 4.0 conference on January 25, 2017, in Düsseldorf, provides an update on the development of the Industrial Data Space. Its focal points are data sovereignty, the Industrial Data Space as a link between IoT cloud platforms, and the logistics reference use case.
Industrial Data Space: Digital Sovereignty over Data by Boris Otto
The talk (in German) introduces basic concepts of the data economy and proposes a definition of the term digital sovereignty. It also shows the important contribution the Industrial Data Space makes to preserving digital sovereignty.
The Industrial Data Space aims at establishing a virtual data space in which partners in business ecosystems can securely exchange and easily link their data assets. The presentation puts the Industrial Data Space in the context of recent developments in the area of Smart Service Welt and Industrie 4.0 and sketches a reference architecture model and functional software components. Furthermore, the presentation introduces the Industrial Data Space Association which institutionalizes the user requirements and drives standardization. The presentation was given at the Industry 4.0 session at MACH 2016 on April 14, 2016, in Birmingham, UK.
Industrial Data Space: Digital Sovereignty for Industry 4.0 and Smart Services by Boris Otto
The document discusses the Industrial Data Space initiative, which aims to establish a trusted network for industrial data exchange. It outlines the role of data in Industry 4.0 and smart services, and describes the Industrial Data Space architecture, which focuses on digital sovereignty, security, and a decentralized federated approach. The Industrial Data Space is being developed through a research project and chartered association, with upcoming activities including further use cases, positioning in Europe, and joint preparation of usage and operating models.
Industrial Data Space: Referenzarchitektur für Data Supply ChainsBoris Otto
This talk presents the Industrial Data Space as a reference architecture for data supply chains. Data supply chains are networked, cross-company data flows. They are a prerequisite for connecting hybrid service offerings (smart services) on the one hand with digitalized production (Industrie 4.0) on the other. By managing data supply chains effectively and efficiently, companies increase their competitiveness. The Industrial Data Space provides the blueprint for this, as a reference architecture for the data economy.
Data is the strategic resource of the digital age. The Industrial Data Space aims to enable companies to exchange data securely and combine it easily, which makes smart services easier to realize. In a project funded by the German Federal Ministry of Education and Research, Fraunhofer is laying the groundwork and developing a reference architecture model for the Industrial Data Space that is being piloted in selected use cases.
The Industrial Data Space is a strategic initiative driven by industry and supported by the German Federal Government. It aims at supporting the secure exchange and easy combination of data within ecosystems.
1. Audi-Endowed Chair of Supply Net Order Management
Best Practices in Master Data Governance
Prof. Dr. Boris Otto | Berlin, 2013/9/23
2. Agenda
Master Data as a Business Success Factor
Five Principles for Master Data Governance
Outlook
3. Bayer CropScience is a leader in the crop protection market
4. Master data quality is a key prerequisite for business process performance¹
Data object “Product Hierarchy”, e.g. code 09 / 11 / 012 / 242 / 3938 across the levels Business Area, Business Field, Business Segment, Active Ingredient and Product Group.
Data quality issues: data not available, data not complete, data not consistent.
Business process impact:
Planning: demand for active ingredients unknown
Revenue reporting: revenue not transparent per country
Segmentation: risk of poor portfolio planning
1) [EBNER/BRAUER 2011].
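The three issue types on this slide can be made concrete with a small validation sketch. The field names and the consistency rule below are illustrative assumptions, not Bayer CropScience's actual rules:

```python
# Sketch of the three issue types named on the slide: availability,
# completeness and consistency checks on product hierarchy records.
# Field names and the business rule are illustrative.

HIERARCHY_LEVELS = ["business_area", "business_field",
                    "business_segment", "active_ingredient",
                    "product_group"]

def check_record(record: dict) -> list[str]:
    """Return a list of data quality issues found in one record."""
    issues = []
    # Availability: the record must carry every hierarchy level.
    missing = [f for f in HIERARCHY_LEVELS if f not in record]
    if missing:
        issues.append(f"not available: {missing}")
    # Completeness: present fields must not be empty.
    empty = [f for f in HIERARCHY_LEVELS if record.get(f) == ""]
    if empty:
        issues.append(f"not complete: {empty}")
    # Consistency (illustrative rule): a product group implies a
    # known active ingredient.
    if record.get("product_group") and not record.get("active_ingredient"):
        issues.append("not consistent: product group without active ingredient")
    return issues

record = {"business_area": "09", "business_field": "11",
          "business_segment": "012", "active_ingredient": "",
          "product_group": "3938"}
print(check_record(record))
```

In practice such checks run in the data maintenance workflow or as a batch scan feeding the data quality metrics discussed later in the deck.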
5. Johnson & Johnson is a leading producer of consumer products
Franchises: Skin Care, Baby Care, Consumer Healthcare, OTC
Headquarters: Skillman, NJ (USA)
6. In early 2008, Johnson & Johnson was suffering from poor master data quality¹
Portfolio Management and New Product Introduction: “Project Management did not know what stage products are in”
Production: “Production was delayed at manufacturing plants”
Inbound Logistics: “Trucks were waiting at the docks for materials to be activated”
Procurement: “Purchase orders were not ready on time”
Sales & Distribution: “Defective data was sent to GS1 US”
Financial Accounting: “Customers were invoiced wrong”
Controlling and Other Support Processes were affected as well.
For less than 30 % of products, dimension and weight data was within the allowed 5 % error margin.
1) [OTTO 2013].
7. Master data quality drivers affect the entire company
Scope: the corporate group and its divisions (Division 1, 2, 3), each with its own business units, business processes and locations.
Drivers: compliance with regulations; a 360-degree view of the customer; integrated and automated business processes; a “Single Source of the Truth”.
8. Master data quality evolves over time according to a “jigsaw” pattern
Figure: master data quality plotted over time; isolated initiatives (Project 1, Project 2, Project 3) each raise quality, while master data quality issues recur in between.
9. The case of Bayer CropScience illustrates the various data quality issues companies have to deal with¹
Root causes of data quality issues by category:
Employees: training and education inadequate
Data Maintenance: no integrated software support; various software solutions in place; data maintenance not harmonized on a global level; master data can be edited in target systems
Data Quality Management: data quality not integrated into performance management systems; no data quality metrics; no continuous data quality monitoring
Standards: no binding rules, standards or operating procedures; too many local rules and exceptions
Organization: no “Data Governance”; missing business responsibilities
1) [BRAUER 2009].
10. Corporate Data Quality Management (CDQM)¹ comprises six key enablers
1) [OTTO ET AL. 2011].
11. Agenda
Master Data as a Business Success Factor
Five Principles for Master Data Governance
Outlook
12. Data Governance and Data Quality Management are closely interrelated
Diagram: “Maximize Data Quality” is a sub-goal of “Maximize Data Value”. Data Quality Management is a sub-function of Data Management and supports the data quality goal; Data Management supports the data value goal and is led by Data Governance. Data assets are the object of all of these functions.
Source: Otto, B.: Data Governance. In: WIRTSCHAFTSINFORMATIK 53 (2011) 4, pp. 235-238.
13. Data Governance effectiveness still varies widely today¹
very good: 7.5 %
good: 7.5 %
mediocre: 25.0 %
adequate: 30.0 %
poor: 30.0 %
1) [MESSERSCHMIDT/STÜBEN 2011].
14. What issues does upper management see with regard to Data Governance? The case of Syngenta
Business benefits
“Keep in mind to balance costs for double-handling on one hand and of high
discipline on the other.”
“Emphasize usability of MDM, its value.”
Organizational readiness
“Data owners and data stewards are terms people don’t understand. Be educational and promotive.”
“Organizational maturity differs in the divisions.”
Data Governance implementation and execution
“What’s the migration path? Are there intermediate staging gates?”
“Is it a journey or can one make a choice? Or both?”
“How to integrate this strategy into the program of next year?”
“How to integrate the 35,000 ft view with daily operations?”
NB: Selected quotes from a series of eight interviews with line managers conducted in October and November 2011.
Prof. Dr. Boris Otto | Berlin, 2013/9/23
14
15. Five key principles lead to excellence in master data governance
Build up a Data Governance Capability
Enter Data “First Time Right”
Capture Data at the Source
Measure to Manage
Scale Capabilities Globally
16. Typically, Data Governance capabilities have first to be built up
NB: Based on data from eight cases (Bayer CropScience, Corning Cable Systems, DB Netz, Deutsche Telekom, Johnson & Johnson, Robert Bosch, Syngenta, ZF).
17. Note taken in a meeting with Johnson & Johnson on November 29, 2011, in Skillman, NJ
18. The ideal lifecycle of Data Governance capabilities follows an “S” curve
Figure: effectiveness (E) plotted against amount of activity (A), with a founding phase, a cleansing phase and a “first time right” phase along the curve.
Legend: E effectiveness; A amount of activity.
19. It is not a perfect world, though
Three observed patterns of effectiveness (E) over amount of activity (A):
Pattern I (2008–2011): 1. CDM unit launched; 2. data creation workflow; 3. DQ metrics launched
Pattern II (2008–2011): 1. DG project launched; 2. address to board; 3. DQ metrics launched; 4. “community” approach proposed; 5. DG council launched
Pattern III (2007–2010): 1. CDM unit launched; 2. progress report to the board proposed; 3. inventory data quality assessment; 4. CDM reorganized
Legend: E effectiveness; A amount of activity; CDM Corporate Data Management; DQ Data Quality; DG Data Governance.
20. Data quality must be assured before transaction MM01 is executed …
Workflow: Data Request → Data Quality Check → Approval of Data Quality → Creation of Data Record
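A minimal sketch of this gating workflow, with placeholder checks standing in for the real quality rules and for the MM01 transaction itself:

```python
# Sketch of the "first time right" workflow on the slide: a material
# master request passes a quality check and an approval gate before
# the record is created (in SAP terms, before MM01 runs). The checks
# and the create step are placeholders, not a real SAP integration.

def quality_check(request: dict) -> bool:
    # Placeholder rule: every requested field must be filled.
    return all(v not in (None, "") for v in request.values())

def approve(request: dict) -> bool:
    # Placeholder for the data steward's sign-off.
    return request.get("approved_by") not in (None, "")

def create_material(request: dict) -> str:
    # Stand-in for the actual MM01 transaction.
    return f"created material {request['material']}"

def process_request(request: dict) -> str:
    if not quality_check(request):
        return "rejected: quality check failed"
    if not approve(request):
        return "rejected: approval missing"
    return create_material(request)

good = {"material": "M-4711", "description": "Gear housing",
        "approved_by": "data.steward"}
print(process_request(good))  # created material M-4711
```

The point of the design is that the create step is unreachable until both gates have passed, which is exactly what the slide demands of the data creation process.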
21. … which is easier said than done ...
34+
22. … with so many different stakeholders involved.
11+ stakeholder groups: R&D, Production, Quality Management, Planning, Purchasing, Financial Accounting, Marketing, Controlling, Sales, Materials Management, Warehouse Management
23. Many companies assess the lifecycle costs of their master data assets
Reported lifecycle cost figures from different companies:
Before use (creation of a new code): 200 EUR, 2,500 EUR, 1,500 EUR, 2,400 EUR, 2,861 EUR; one company reports 3,000 CHF
During use (code change): 175 EUR
After use: 133 EUR
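The lifecycle view can be expressed as a simple cost model; the figures plugged in below follow the orders of magnitude on the slide but are illustrative only:

```python
# Back-of-the-envelope sketch of the lifecycle cost view: a master
# data record accrues cost before use (creation), during use
# (changes) and after use (deactivation). Rates are illustrative.

def lifecycle_cost(creation_eur: float, change_eur: float,
                   n_changes: int, deactivation_eur: float) -> float:
    """Total cost of one record over its whole lifecycle."""
    return creation_eur + n_changes * change_eur + deactivation_eur

# One material code: 2861 EUR to create, changed four times at
# 175 EUR each, 133 EUR to retire.
print(lifecycle_cost(2861, 175, 4, 133))  # 3694
```

Multiplied across hundreds of thousands of records, this is the quantity that makes the business case for avoiding unnecessary creations and for retiring obsolete codes.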
24. Data must be captured at the source of the knowledge about it¹
1) [FOHRER 2012].
25. A data quality index is an effective performance management tool at Bayer CropScience¹
Chart “Evolution of Material Master Data Quality”: data quality index in percent (scale 84–100 %), tracked from 11/2009 to 01/2011 for Asia Pacific, Europe, Latin America and North America.
1) [EBNER/BRAUER 2011].
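One common way to build such an index — assumed here, since the slide does not define Bayer's exact formula — is the share of records passing all business rules:

```python
# Sketch of a data quality index like the one tracked per region:
# the share of material master records that pass all defined
# business rules, as a percentage. Rules and records are illustrative.

def dq_index(records: list[dict], rules) -> float:
    """Percentage of records that satisfy every rule."""
    if not records:
        return 100.0
    passed = sum(1 for r in records if all(rule(r) for rule in rules))
    return round(100.0 * passed / len(records), 1)

rules = [
    lambda r: bool(r.get("description")),  # description filled
    lambda r: r.get("weight", 0) > 0,      # plausible weight
]

europe = [
    {"description": "Herbicide A", "weight": 1.2},
    {"description": "", "weight": 0.8},
    {"description": "Fungicide B", "weight": 2.0},
    {"description": "Insecticide C", "weight": 0.0},
]
print(dq_index(europe, rules))  # 50.0
```

Computing the same index per region and per month yields exactly the kind of trend chart the slide shows.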
26. Johnson & Johnson has reached a six sigma data quality level¹
Evolution of Material Master Data Quality
1) [OTTO 2013].
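To relate a defect rate to a sigma level, the conventional conversion with the customary 1.5-sigma shift can be used; the numbers below are illustrative, not J&J's actual figures:

```python
from statistics import NormalDist

# What "six sigma" means for data quality: at six sigma (with the
# conventional 1.5-sigma shift), about 3.4 defective records per
# million opportunities are expected. This helper converts an
# observed defect rate into a sigma level.

def sigma_level(defects: int, opportunities: int) -> float:
    """Short-term sigma level for a given defect rate (1.5 shift)."""
    dpmo = 1_000_000 * defects / opportunities
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

# 3.4 defective records per million checked field values:
print(round(sigma_level(34, 10_000_000), 2))  # 6.0
```

Reaching this level for master data means defects per million checked attribute values, a far stricter bar than the "30 % within tolerance" starting point of 2008.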
27. Data Governance at Bosch engages different roles on different organizational levels across the company¹
Org chart (simplified): Executive Management and a Master Data Management Steering Committee at the top, to which the governance roles report. For each master data class (class 1 … class N, e.g. supplier master data or the chart of accounts) there is a Master Data Owner at the corporate sector/corporate department level with overall responsibility, supported by Master Data Officers with responsibility in the relevant units (data maintenance/application). A governance function develops the concepts for each master data class at the specialist/organizational level; interdisciplinary working groups/competence teams (MD Owner, IT, …) elaborate them, and IT projects implement them on the IT platforms and IT target systems.
1) [HATZ 2008].
28. The “business case” for Data Governance and Corporate Data Quality must take into account their very nature
Illustration: energy networks and highway networks as analogies for Corporate Data Quality.
29. Agenda
Master Data as a Business Success Factor
Five Principles for Master Data Governance
Outlook
30. Many enterprises are on the way towards a new corporate data architecture
A layered data architecture: “Nucleus Data” (customer master data, product master data etc.) at the core, “Community Data” (geo-information, GTIN, addresses etc.) around it, and “Open Big Data” (tweets, social media streams, sensor data etc.) in the outermost circle. The further out the circle, the less control, criticality and unambiguity the data offers, and the higher its “fuzziness”, volume and change frequency.
31. SAP and the CC CDQ have published a joint white paper
32. The Competence Center Corporate Data Quality (CC CDQ) channels “best practices” of market-leading companies
NB: Past and present partner companies.
33. Your Speaker
Univ.-Prof. Dr.-Ing. Boris Otto
TU Dortmund University
Audi-Endowed Chair of
Supply Net Order Management
LogistikCampus
Joseph-Fraunhofer-Straße 2-4
D-44227 Dortmund
Boris.Otto@tu-dortmund.de
34. References
[BRAUER 2009]
B. BRAUER, Master Data Quality Cockpit at Bayer CropScience, 4. Workshop des Kompetenzzentrums Corporate Data Quality 2 (CC
CDQ2), Universität St. Gallen, Luzern, 2009.
[EBNER/BRAUER 2011]
V. EBNER, B. BRAUER: Fallstudie zum Führungssystem für Stammdatenqualität bei der Bayer CropScience AG. In: HMD - Praxis der Wirtschaftsinformatik 48 (2011), pp. 64-73.
[FOHRER 2012]
M. FOHRER, 2012. Driving Corporate Data Quality @ Hilti through the use of Consumer Technology. 10. CC CDQ3-Workshop.
Bregenz: Universität St. Gallen, Institut für Wirtschaftsinformatik.
[HATZ 2008]
A. HATZ, BOSCH Master data Management, 6. CC CDQ Workshop, St. Gallen, 2008.
[MESSERSCHMIDT/STÜBEN 2011]
M. MESSERSCHMIDT, J. STÜBEN: Verborgene Schätze: Eine internationale Studie zum Master-Data-Management, PricewaterhouseCoopers AG, 2011.
[OTTO ET AL. 2011]
B. OTTO, J. KOKEMÜLLER, A. WEISBECKER, D. GIZANIS: Stammdatenmanagement: Datenqualität für Geschäftsprozesse. In: HMD - Praxis der Wirtschaftsinformatik 48 (2011), pp. 5-16.
[OTTO 2013]
B. OTTO, 2013. On the Evolution of Data Governance in Firms: The Case of Johnson & Johnson Consumer Products North America.
In: SADIQ, S. (ed.) Handbook of Data Quality - Research and Practice. Berlin: Springer.