This presentation was given by Professor Christine Legner (HEC Lausanne) at the Swiss Day on November 8, 2017, in Lausanne, Switzerland. It addresses the need for organisations to think about data and its management in new ways, as many corporations engage in the digital and data-driven transformation of their business. It concludes with three recommendations: 1) assess data's business value and impact, 2) measure and improve data quality, and 3) democratize data and support data citizenship.
Machine learning techniques to improve data management and data quality - this presentation by Prof. Christine Legner and Martin Fadler summarizes research conducted in the Competence Center Corporate Data Quality (CC CDQ). It was held on February 13, 2019 at the DSAG Technologietage in Bonn.
Why not start with data sharing? Sharing assets reduces costs and improves utilization and sustainability, yet data assets alone are still managed in silos.
One of the few successful examples of data sharing is the CDQ Data Sharing Community for business partner data. This talk analyzes the approach and distinguishes two levels of data sharing:
(1) data knowledge sharing (semantics, rules and reference data), and
(2) data asset sharing (peer-based sharing of validated data)
Data sharing leads to higher data quality, lower data maintenance efforts, reduced risks and higher trust in data.
Machine learning for data management: findings and implications for data management:
Machine learning has significant potential to improve data quality, but will at the same time disrupt data management processes and practices.
Data management processes will be redesigned:
- Highly repetitive and simple cases will be automated by the machine, while humans need to intervene in more difficult and complex cases
--> Machine takes over prediction
--> Human judges output and confirms
There are some important prerequisites:
- Machine learning techniques depend on high-quality data ("garbage in, garbage out")
- New roles and skills are required to explore and productize machine learning
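The machine-plus-human division of labour described above can be sketched as a simple confidence-threshold router. This is a hypothetical illustration, not the method from the presentation: function names and the 0.9 threshold are assumptions chosen for clarity.

```python
# Hypothetical sketch: the model applies high-confidence predictions
# automatically, while low-confidence (difficult, ambiguous) cases are
# routed to a human data steward for review and confirmation.

def route_prediction(record, predicted_label, confidence, threshold=0.9):
    """Return an action for one data-quality prediction."""
    if confidence >= threshold:
        # Simple, repetitive case: apply the machine's prediction directly.
        return ("auto_apply", predicted_label)
    # Difficult or complex case: queue for human judgement.
    return ("human_review", predicted_label)

# Example: classifying suspected duplicate customer records.
decisions = [
    route_prediction({"id": 1}, "duplicate", 0.97),
    route_prediction({"id": 2}, "duplicate", 0.55),
]
print(decisions)  # [('auto_apply', 'duplicate'), ('human_review', 'duplicate')]
```

In practice the threshold would be tuned against the cost of a wrong automatic correction versus the cost of a steward's time.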
Machine learning techniques to improve data management and data quality - presentation by Tobias Pentek and Martin Fadler from the Competence Center Corporate Data Quality. It was presented at the Marcus Evans event in Amsterdam on February 8, 2019.
Good systems development often depends on multiple data management disciplines that provide a solid foundation. One of these is metadata. While much of the discussion around metadata focuses on understanding metadata itself along with its associated technologies, this perspective reflects a typical tool-and-technology focus, which has not achieved significant results to date. A more relevant question when considering pockets of metadata is whether to include them in the scope of organizational metadata practices. By understanding what it means to include items in the scope of your metadata practices, you can begin to build systems that let you advance your data management and the business initiatives it supports in increasingly sophisticated ways. After some practice in this manner, you can position your organization to better exploit any and all metadata technologies in support of business strategy.
Takeaways:
Metadata value proposition: How to leverage metadata in support of your business strategy
Understanding foundational metadata concepts based on the DAMA DMBOK
Guiding principles & lessons learned
Data-Ed Online Presents: Data Warehouse Strategies (DATAVERSITY)
Integrating data across systems has been a perpetual challenge. Unfortunately, the current technology-focused solutions have not helped IT to improve its dismal project success statistics. Data warehouses, BI implementations, and general analytical efforts achieve the same levels of success as other IT projects: approximately one-third are considered successes when measured against price, schedule, or functionality objectives. The first step is determining the appropriate analysis approach to the data system integration challenge. The second step is understanding the strengths and weaknesses of various approaches. It turns out that proper analysis at this stage makes actual technology selection far more accurate. Only when these are accomplished can proper matching between problem and capabilities be achieved as the third step, and true business value be delivered. This webinar will illustrate that good systems development more often depends on at least three data management disciplines to provide a solid foundation.
Takeaways:
Data system integration challenge analysis
Understanding of a range of data system-integration technologies including
Problem space (BI, Analytics, Big Data), Data (Warehousing, Vault, Cube) and alternative approaches (Virtualization, Linked Data, Portals, Meta-models)
Understanding foundational data warehousing & BI concepts based on the Data Management Body of Knowledge (DMBOK)
How to utilize data warehousing & BI in support of business strategy
Building an Effective Data & Analytics Operating Model: A Data Modernization G... (Mark Hewitt)
This is the age of analytics—information resulting from the systematic analysis of data.
Insights gained from applying data and analytics to business allow large and small organizations across diverse industries (healthcare, retail, manufacturing, financial services, and others) to identify new opportunities, improve core processes, enable continuous learning and differentiation, remain competitive, and thrive in an increasingly challenging business environment.
The key to building a data-driven practice is a Data and Analytics Operating Model (D&AOM) which enables the organization to establish standards for data governance, controls for data flows (both within and outside the organization), and adoption of appropriate technological innovations.
Success measures of a data initiative may include:
• Creating a competitive advantage by fulfilling unmet needs,
• Driving adoption and engagement of the digital experience platform (DXP),
• Delivering industry standard data and metrics, and
• Reducing the lift on service teams.
This green paper lays out the framework for building and customizing an effective data and analytics operating model.
Corporate Data Quality Management Research and Services Overview (Boris Otto)
This presentation provides an overview of the research and services portfolio of the Business Engineering Institute (BEI) St. Gallen in the field of corporate data quality management (CDQM). CDQM comprises topics such as data governance, data quality measurement, master data management, and data architecture management. At the core of the research and service portfolio is the Competence Center Corporate Data Quality (CC CDQ), a consortium research project at the Institute of Information Management at the University of St. Gallen (IWI-HSG). Partner companies come from various industry and service sectors.
Big Data Analytics Architecture PowerPoint Presentation Slides (SlideTeam)
Presenting this set of slides: Big Data Analytics Architecture PowerPoint Presentation Slides. This PPT deck contains twenty-six slides with in-depth research. Our topic-oriented deck is a helpful tool to plan, prepare, document, and analyse the topic with a clear approach. We provide a ready-to-use deck with relevant topics and subtopics, templates, charts and graphs, overviews, and analysis templates, so you can outline all the important aspects without any hassle. It showcases editable templates and infographics for an inclusive and comprehensive presentation. Professionals, managers, individuals, and teams in any organization, from any field, can use them as required.
Data governance isn't about data. It's about relationships. Who needs information and who has information? How does decision making relate to information, the systems that manage that information, and processes that create information? My wife doesn't spend her time thinking about databases, analytics, enterprise KPIs, and business performance, but she schools me whenever we talk about the daily challenges that confront me during data governance initiatives. These are some lessons from my marriage that are critical to effective data governance.
Data Management Meets Human Management - Why Words Matter (DATAVERSITY)
At Fifth Third Bank, about 450 people use data every day. They all start with Alation. But this wasn't always the case. In fact, getting hundreds of folks working in sync has been a monumental task.
Just ask Greg Swygart, VP of enterprise data at Fifth Third Bank. Greg has led data consumption and interaction efforts since adopting Alation. Currently he’s scaling out data literacy for Fifth Third, replicating data capabilities to all roles across the company.
Join Greg to learn how Fifth Third Bank moved from a command-and-control governance approach to non-invasive — and reaped the benefits. Greg will be followed by Bob Seiner, creator of Non-Invasive Data Governance, who will speak to data governance’s evolution, with an eye to what’s next.
In this webinar, you'll learn:
• About Fifth Third’s transition away from command-and-control governance
• How Fifth Third leverages Alation as its data marketplace for curation & consumption
• Why words matter when driving adoption
• About the data catalog — and its role in human management
Data Modelling 101: a half-day workshop presented by Chris Bradley at the Enterprise Data and Business Intelligence conference in London on November 3rd, 2014.
Chris Bradley is a leading independent information strategist.
Contact chris.bradley@dmadvisors.co.uk
This talk was given at SEMANTiCS 2014 in Leipzig. It gives an overview of how to develop an enterprise linked data strategy around controlled vocabularies based on SKOS, and discusses how SKOS-based knowledge graphs can be extended step by step according to the needs of the organization.
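A SKOS-style controlled vocabulary like the one the SEMANTiCS talk describes can be illustrated in plain code. The sketch below is a deliberate simplification: it models concepts as dictionaries rather than RDF triples, and the vocabulary entries and function names are invented for illustration, mirroring the roles of skos:prefLabel, skos:altLabel, and skos:broader.

```python
# Minimal, hypothetical sketch of a SKOS-style vocabulary: each concept has a
# preferred label, alternative labels, and a "broader" link to its parent.
vocabulary = {
    "beverage": {"prefLabel": "Beverage", "altLabels": [], "broader": None},
    "coffee":   {"prefLabel": "Coffee", "altLabels": ["java"], "broader": "beverage"},
    "espresso": {"prefLabel": "Espresso", "altLabels": [], "broader": "coffee"},
}

def broader_chain(concept_id):
    """Walk the broader links from a concept up to the top concept."""
    chain = []
    current = vocabulary[concept_id]["broader"]
    while current is not None:
        chain.append(current)
        current = vocabulary[current]["broader"]
    return chain

print(broader_chain("espresso"))  # ['coffee', 'beverage']
```

Extending the knowledge graph step by step then amounts to adding new concepts with labels and broader links, which is exactly the incremental growth the talk advocates.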
ADV Slides: When and How Data Lakes Fit into a Modern Data Architecture (DATAVERSITY)
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few critical design points, and it does need to follow some principles for success. Build the data lake, but avoid building the data swamp! The tool ecosystem is building up around the data lake, and soon many organizations will have both a robust lake and a data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users’ confidence in their data platforms high.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
This presentation was part of the IDS Webinar on Data Governance. It gives a brief overview of the history on Data Governance, describes how governing data has to be further developed in the era of business and data ecosystems, and outlines the contribution of the International Data Spaces Association on the topic.
Webinar: Decoding the Mystery - How to Know if You Need a Data Catalog, a Dat... (DATAVERSITY)
There’s a lot of confusion out there about the differences between a data catalog, a data dictionary and a business glossary, and it's not always easy to understand who needs which and why. Join Malcolm Chisholm, Ph.D., President of Data Millennium, and Amichai Fenner, Product Lead at Octopai, as they help decode the mystery. Spoiler alert: one of these enables collaboration across BI and IT, which is it?
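One way to make the distinction concrete is to look at the shape of a typical record in each artifact. The sketch below is illustrative only (field names and values are invented, not from the webinar): a business glossary defines terms, a data dictionary describes physical fields, and a data catalog links datasets to both, which is what lets BI and IT collaborate over the same data.

```python
# Hypothetical records showing what each artifact typically holds.

glossary_term = {
    "term": "Customer",
    "definition": "A party that has purchased at least one product.",
    "steward": "Sales Operations",     # business-facing ownership
}

dictionary_entry = {
    "table": "crm.customers",
    "column": "cust_id",
    "type": "VARCHAR(10)",
    "nullable": False,                 # technical, system-level detail
}

catalog_entry = {
    "dataset": "crm.customers",
    "glossary_terms": ["Customer"],    # links business meaning...
    "columns": ["cust_id"],            # ...to physical structure
    "owners": ["dba-team", "Sales Operations"],
}

# The catalog bridges the business (glossary) and technical (dictionary) views.
print(catalog_entry["glossary_terms"], catalog_entry["columns"])
```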
Master data management (MDM) is defined as an application-independent process which describes, owns and manages core business data entities. The establishment of the MDM process is a Business Engineering (BE) task which requires organizational design. This paper reports on the results of a questionnaire survey among large enterprises aiming at delivering insight into what tasks and master data classes MDM organizations cover (“scope”) and how many people they employ (“size”). The nature of the study is descriptive, i.e. it allows for the identification of patterns and trends in organizing the MDM process.
Slides: Why You Need End-to-End Data Quality to Build Trust in Kafka (DATAVERSITY)
By adopting streaming architectures like Apache Kafka as a way to ingest and move large amounts of data very quickly, organizations are making major investments to access real-time data – and fundamentally changing how they do business. However, the advantages of Kafka can quickly be outweighed by the threat of poor Data Quality. Without Data Quality, all of the time and resources spent in building a new framework will fail to return the benefits that a Kafka platform offers.
Join Infogix’s Jeff Brown as he shares how data trust in your Kafka streaming framework is achievable when you put the proper validations and Data Quality components in place.
In this webinar, you’ll learn:
• Why organizations are moving to a streaming-based architecture
• What challenges are being faced when adopting Kafka messages as a new system-to-system communication method
• How to build data trust within your organization and its streaming framework
• Key directions on how to reconcile, balance, validate, and apply Data Quality to your streaming Data Architecture
• What customers are saying about their Kafka investment and how they’re working with Infogix to deliver data trust
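The "validate and reconcile" idea from the bullets above can be sketched as a small in-stream check: each message value is validated against simple data-quality rules before processing, and failures are diverted for reconciliation. This is a generic illustration, not Infogix's product logic; the rule set and field names are assumptions, and in production this function would sit inside a Kafka consumer loop.

```python
# Hypothetical in-stream data-quality gate for Kafka message values.
import json

RULES = {
    "customer_id": lambda v: isinstance(v, str) and len(v) > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def validate(message_bytes):
    """Return (record, errors) for one serialized message value."""
    record = json.loads(message_bytes)
    errors = [field for field, rule in RULES.items()
              if not rule(record.get(field))]
    return record, errors

# Route valid records onward; quarantine the rest for reconciliation.
good, quarantined = [], []
for raw in [b'{"customer_id": "C-1", "amount": 42.0}',
            b'{"customer_id": "", "amount": -5}']:
    record, errors = validate(raw)
    (good if not errors else quarantined).append(record)
print(len(good), len(quarantined))  # 1 1
```

Balancing and reconciliation would then compare counts and totals of produced versus validated records across the pipeline.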
ADV Slides: How to Improve Your Analytic Data Architecture Maturity (DATAVERSITY)
Many organizations are immature when it comes to data use. The answer lies in delivering a greater level of insight from data, straight to the point of need. Enter: machine learning.
In this webinar, William will look at categories of organizational response to the challenge across strategy, architecture, modeling, processes, and ethics. Machine learning maturity levels tend to move in harmony across these categories. As a general principle of maturity models, you can’t skip levels in any category, nor can you advance in one category well beyond the others.
Vis-à-vis ML, attaining and retaining momentum up the model is paramount for success. You will ascend the model through concerted efforts delivering business wins utilizing progressive elements of the model, and thereby increasing your machine learning maturity. The model will evolve. No plateaus are comfortable for long.
With ML maturity markers, sequencing, and tactics, this webinar provides a plan for how to build analytic Data Architecture maturity in your organization.
ADV Slides: The Evolution of the Data Platform and What It Means to Enterpris... (DATAVERSITY)
Thirty years is a long time for a technology foundation to remain as active as relational databases have. Are their replacements here?
In this webinar, we look at this foundational technology for modern Data Management and show how it evolved to meet the workloads of today, as well as when other platforms make sense for enterprise data.
Smarter businesses apply AI to learn and continuously evolve the way they work. To extract full value from AI, companies need a data strategy that gives them access to all their data, no matter where it lives, in an environment that easily scales and applies the latest discovery technology, including advanced analytics, visualization, and AI. Learn how IBM Watson and Data provides all the tools companies need to embed AI, machine learning, and deep learning in their business, while enabling professionals to gain the most from their data to drive smarter business and lead industry-changing transformations.
An Agile & Adaptive Approach to Addressing Financial Services Regulations and... (Neo4j)
Watch this webinar and learn how Neo4j and ICC Technology can help you remove risk from your data governance by improving the way you approach data lineage. We’ll cover some of the common approaches, driving regulations, and biggest risks for banks and financial services companies.
-Find out how Data Lineage is becoming more complex for Banks and Financial Services companies
-Learn how a native-graph model can improve tracing data sources to targets, as well as storing transformations.
-Watch a demonstration on how you might approach regulations such as BCBS 239
Foundational Strategies for Trust in Big Data Part 2: Understanding Your Data (Precisely)
Teams working on new initiatives, whether for customer engagement, advanced analytics, or regulatory and compliance requirements, need a broad range of data sources for the highest-quality and most trusted results. Yet the sheer volume of data delivered, coupled with the range of data sources, including those from external third parties, increasingly precludes trust, confidence, and even understanding of the data and how (or whether) it can be used to make effective data-driven business decisions.
The second part of our webcast series on Foundational Strategies for Trust in Big Data provides insight into how Trillium Discovery for Big Data, with its natively distributed execution for data profiling, supports a foundation of data quality by enabling business analysts to gain rapid insight into data delivered to the data lake without technical expertise.
According to the PfMP certification, it is critical to keep track of project progress in order to keep the timetable on track. Six elements of comprehensive project reports are outlined here.
How I developed an analytics story for services about four years ago. It contains a maturity model, business potential, service structures, and areas to which analytics can be applied.
Big Data Analytics: A New Business Opportunity (Edward Curry)
This talk introduces Big Data analytics and how they can be used to deliver value within organisations. The talk will cover the transformational potential of creating data value chains between different sectors. Developing a Big Data analytics capability will be discussed in addition to the challenges facing the emerging data economy.
Big Data Analytics Architecture PowerPoint Presentation SlidesSlideTeam
Presenting this set of slides with name - Big Data Analytics Architecture Powerpoint Presentation Slides. This PPT deck displays twenty six slides with in depth research. Our topic oriented Big Data Analytics Architecture Powerpoint Presentation Slides presentation deck is a helpful tool to plan, prepare, document and analyse the topic with a clear approach. We provide a ready to use deck with all sorts of relevant topics subtopics templates, charts and graphs, overviews, analysis templates. Outline all the important aspects without any hassle. It showcases of all kind of editable templates infographs for an inclusive and comprehensive Big Data Analytics Architecture Powerpoint Presentation Slides presentation. Professionals, managers, individual and team involved in any company organization from any field can use them as per requirement.
Data governance isn't about data. It's about relationships. Who needs information and who has information? How does decision making relate to information, the systems that manage that information, and processes that create information? My wife doesn't spend her time thinking about databases, analytics, enterprise KPIs, and business performance, but she schools me whenever we talk about the daily challenges that confront me during data governance initiatives. These are some lessons from my marriage that are critical to effective data governance.
Data Management Meets Human Management - Why Words MatterDATAVERSITY
At Fifth Third Bank, about 450 people use data every day. They all start with Alation. But this wasn't always the case. In fact, getting hundreds of folks working in sync has been a monumental task.
Just ask Greg Swygart, VP of enterprise data at Fifth Third Bank. Greg has led data consumption and interaction efforts since adopting Alation. Currently he’s scaling out data literacy for Fifth Third, replicating data capabilities to all roles across the company.
Join Greg to learn how Fifth Third Bank moved from a command-and-control governance approach to non-invasive — and reaped the benefits. Greg will be followed by Bob Seiner, creator of Non-Invasive Data Governance, who will speak to data governance’s evolution, with an eye to what’s next.
In this webinar, you'll learn:
• About Fifth Third’s transition away from command-and-control governance
• How Fifth Third leverages Alation as its data marketplace for curation & consumption
• Why words matter when driving adoption
• About the data catalog — and its role in human management
Data Modelling 101 half day workshop presented by Chris Bradley at the Enterprise Data and Business Intelligence conference London on November 3rd 2014.
Chris Bradley is a leading independent information strategist.
Contact chris.bradley@dmadvisors.co.uk
This talk was given at SEMANTiCS 2014 in Leipzig. It gives an overview how to develop an enterprise linked data strategy around controlled vocabularies based on SKOS. It discusses how knowledge graphs based on SKOS can extended step by step due to the needs of the organization.
ADV Slides: When and How Data Lakes Fit into a Modern Data ArchitectureDATAVERSITY
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Avoid building the data swamp, but not the data lake! The tool ecosystem is building up around the data lake and soon many will have a robust lake and data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users’ confidence up in their data platforms.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
This presentation was part of the IDS Webinar on Data Governance. It gives a brief overview of the history on Data Governance, describes how governing data has to be further developed in the era of business and data ecosystems, and outlines the contribution of the International Data Spaces Association on the topic.
Webinar: Decoding the Mystery - How to Know if You Need a Data Catalog, a Dat...DATAVERSITY
There’s a lot of confusion out there about the differences between a data catalog, a data dictionary and a business glossary, and it's not always easy to understand who needs which and why. Join Malcolm Chisholm, Ph.D., President of Data Millennium, and Amichai Fenner, Product Lead at Octopai, as they help decode the mystery. Spoiler alert: one of these enables collaboration across BI and IT, which is it?
Master data management (MDM) is defined as an application-independent process which describes, owns and manages core business data entities. The establishment of the MDM process is a Business Engineering (BE) tasks which requires organizational design. This paper reports on the results of a questionnaire survey among large enterprises aiming at delivering insight into what tasks and master data classes MDM organizations cover (“scope”) and how many people they employ (“size”). The nature of the study is descriptive, i.e. it allows for the identification of patterns and trends in organizing the MDM process.
Slides: Why You Need End-to-End Data Quality to Build Trust in KafkaDATAVERSITY
By adopting streaming architectures like Apache Kafka as a way to ingest and move large amounts of data very quickly, organizations are making major investments to access real-time data – and fundamentally changing how they do business. However, the advantages of Kafka can quickly be outweighed by the threat of poor Data Quality. Without Data Quality, all of the time and resources spent in building a new framework will fail to return the benefits that a Kafka platform offers.
Join Infogix’s Jeff Brown as he shares how data trust in your Kafka streaming framework is achievable when you put the proper validations and Data Quality components in place.
In this webinar, you’ll learn:
• Why organizations are moving to a streaming-based architecture
• What challenges are being faced when adopting Kafka messages as a new system-to-system communication method
• How to build data trust within your organization and its streaming framework
• Key directions on how to reconcile, balance, validate, and apply Data Quality to your streaming Data Architecture
• What customers are saying about their Kafka investment and how they’re working with Infogix to deliver data trust
ADV Slides: How to Improve Your Analytic Data Architecture MaturityDATAVERSITY
Many organizations are immature when it comes to data use. The answer lies in delivering a greater level of insight from data, straight to the point of need. Enter: machine learning.
In this webinar, William will look at categories of organizational response to the challenge across strategy, architecture, modeling, processes, and ethics. Machine learning maturity levels tend to move in harmony across these categories. As a general principle of maturity models, you can’t skip levels in any category, nor can you advance in one category well beyond the others.
Vis-à-vis ML, attaining and retaining momentum up the model is paramount for success. You will ascend the model through concerted efforts delivering business wins utilizing progressive elements of the model, and thereby increasing your machine learning maturity. The model will evolve. No plateaus are comfortable for long.
With ML maturity markers, sequencing, and tactics, this webinar provides a plan for how to build analytic Data Architecture maturity in your organization.
ADV Slides: The Evolution of the Data Platform and What It Means to Enterpris... (DATAVERSITY)
Thirty years is a long time for a technology foundation to remain as active as relational databases have. Are their replacements here?
In this webinar, we look at this foundational technology for modern Data Management and show how it evolved to meet the workloads of today, as well as when other platforms make sense for enterprise data.
Smarter businesses apply AI to learn and continuously evolve the way they work. To extract full value from AI, companies need a data strategy that gives them access to all their data, no matter where it lives, in an environment that easily scales and applies the latest discovery technology, including advanced analytics, visualization and AI. Learn how IBM Watson and Data provides all the tools companies need to embed AI, machine learning and deep learning in their business, while enabling professionals to get the most from their data to drive smarter business and lead industry-changing transformations.
An Agile & Adaptive Approach to Addressing Financial Services Regulations and... (Neo4j)
Watch this webinar and learn how Neo4j and ICC Technology can help you remove risk from your data governance by improving the way you approach data lineage. We’ll cover some of the common approaches, driving regulations and biggest risks for banks and finances services.
-Find out how Data Lineage is becoming more complex for Banks and Financial Services companies
-Learn how a native-graph model can improve tracing data from sources to targets, as well as storing transformations.
-Watch a demonstration on how you might approach regulations such as BCBS 239
Foundational Strategies for Trust in Big Data Part 2: Understanding Your Data (Precisely)
Teams working on new initiatives, whether for customer engagement, advanced analytics, or regulatory and compliance requirements, need a broad range of data sources to achieve the highest-quality and most trusted results. Yet the sheer volume of data delivered, coupled with the range of data sources, including those from external third parties, increasingly precludes trust, confidence, and even understanding of the data and of how, or whether, it can be used to make effective data-driven business decisions.
The second part of our webcast series on Foundation Strategies for Trust in Big Data provides insight into how Trillium Discovery for Big Data with its natively distributed execution for data profiling supports a foundation of data quality by enabling business analysts to gain rapid insight into data delivered to the data lake without technical expertise.
According to the PfMP certification, it is critical to keep track of project progress in order to keep the schedule on track. Six elements of comprehensive project reports are mentioned here.
How I managed to develop an analytics story for services about 4 years back. Contains a maturity model, business potential, and service structure areas that analytics can be applied to.
Big Data Analytics: A New Business Opportunity (Edward Curry)
This talk introduces Big Data analytics and how they can be used to deliver value within organisations. The talk will cover the transformational potential of creating data value chains between different sectors. Developing a Big Data analytics capability will be discussed in addition to the challenges facing the emerging data economy.
How is Data Governance like an amusement park? (Denodo)
Watch full webinar here: https://bit.ly/3Ab9gYq
Imagine arriving at an amusement park with your family and starting your day without the typical map that lets you plan which shows to see, which rides to go on, where the kids can and cannot ride... You probably won't get the most out of your day and will have missed many things. Some people like to go on an adventure and discover things little by little, but when we talk about business, going on an adventure can be fatal...
In the era of the explosion of information spread across different sources, data governance is key to guaranteeing the availability, usability, integrity and security of that information. Likewise, the set of processes, roles and policies it defines allows organizations to achieve their objectives while ensuring the efficient use of their data.
Data virtualization, a strategic tool for implementing and optimizing data governance, allows companies to create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of format or location. In this way, it brings together multiple data sources, makes them accessible from a single layer and provides traceability capabilities to monitor changes in the data.
In this webinar you will learn how to:
- Accelerate the integration of data from fragmented data sources in internal and external systems and obtain a comprehensive view of the information.
- Enable a single, protected data access layer across the enterprise.
- Use data virtualization as the foundation for complying with current data protection regulations through auditing, a data catalog and data security.
Minggu-02 Big Data Business Model Maturity Index.pdf (azkamuhammad11)
An Internet market research professional, I am trilingual (French, English, German); I am energetic, like to work in teams, and in open, multinational and multicultural environments; I like to beat challenges, discover new areas, decipher new trends, and be on the cutting edge.
I am especially at ease with Digital Data, providing innovative concepts and accurate data elicitation. A member of the Digital Analytics Association and a Certified Web Analyst, I am working on missions in the Data Management and Analytics field. I am also open to any assignment as a Chief Data Officer, be it temporary or permanent.
Innovative Data Leveraging for Procurement Analytics (Tejari)
This webinar will explore the types of problems and questions faced by procurement executives that can benefit most through the application of analytical solutions (e.g. innovation, strategic cost management, risk mitigation, etc.). In addition, we will cover the different forms of cognitive solutions that are emerging to drive real-time decision-making and predictive sourcing capabilities.
Operationalizing Customer Analytics with Azure and Power BI (CCG)
Many organizations fail to realize the value of data science teams because they are not effectively translating the analytic findings produced by these teams into quantifiable business results. This webinar demonstrates how to visualize analytic models like churn and turn their output into action. Senior Business Solution Architect, Mike Druta, presents methods for operationalizing analytic models produced by data science teams into a repeatable process that can be automated and applied continuously using Azure.
Expert data analytics prove to be highly transformative when applied in context to corporate business strategies.
This webinar covers various approaches and strategies that will give you a detailed insight into planning and executing your Data Analytics projects.
Implement an efficient data governance and security strategy with the ... (Denodo)
Watch full webinar here: https://bit.ly/3lSwLyU
In the era of the explosion of information spread across different sources, data governance is a key component in guaranteeing the availability, usability, integrity and security of information. Likewise, the set of processes, roles and policies it defines allows organizations to achieve their objectives while ensuring the efficient use of their data.
Data virtualization is one of the strategic tools for implementing and optimizing data governance. This technology allows companies to create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of format or location. In this way, it brings together multiple data sources, makes them accessible from a single layer and provides traceability capabilities to monitor changes in the data.
We invite you to join this webinar to learn:
- How to accelerate the integration of data from fragmented data sources in internal and external systems and obtain a comprehensive view of the information.
- How to enable a single, protected data access layer across the enterprise.
- How data virtualization provides the pillars for complying with current data protection regulations through auditing, a data catalog and data security.
Enterprises are faced with information overload. Big data appears as an opportunity, but has no relevance until enterprises can put it in the context of their activities, processes, and organizations. Applying MDM principles to Big Data is therefore an opportunity that enterprises should target.
This presentation covers the following topics :
- what is MDM and Information Management
- what is Big Data and what are the use cases
- why and how can Big Data take advantage of MDM? And why and how can MDM take advantage of Big Data?
How to Capitalize on Big Data with Oracle Analytics Cloud (Perficient, Inc.)
The average age of a company listed on the S&P 500 has fallen from almost 60 years old in the 1950s to less than 20 years old today. Innovative companies that are willing to embrace transformative technologies make the list today, while businesses that are hesitant to embrace change risk becoming obsolete.
Innovators use big data solutions as a competitive advantage to increase revenue, reduce cost, and improve cash flow. Turn big data into actionable insights with Oracle Analytics Cloud.
We identified the big data opportunities in front of you and how to take advantage of them:
-Big data and its architecture
-Why a big data strategy is imperative to remaining relevant
-How Oracle Analytics Cloud can help you connect people, places, data, and systems to fundamentally change how you analyze, understand, and act on information
Similar to Managing Data as a Strategic Resource – Foundation of the Digital and Data-Driven Enterprise (20)
Empowering the Data Analytics Ecosystem: A Laser Focus on Value
The data analytics ecosystem thrives when every component functions at its peak, unlocking the true potential of data. Here's a laser focus on key areas for an empowered ecosystem:
1. Democratize Access, Not Data:
Granular Access Controls: Provide users with self-service tools tailored to their specific needs, preventing data overload and misuse.
Data Catalogs: Implement robust data catalogs for easy discovery and understanding of available data sources.
2. Foster Collaboration with Clear Roles:
Data Mesh Architecture: Break down data silos by creating a distributed data ownership model with clear ownership and responsibilities.
Collaborative Workspaces: Utilize interactive platforms where data scientists, analysts, and domain experts can work seamlessly together.
3. Leverage Advanced Analytics Strategically:
AI-powered Automation: Automate repetitive tasks like data cleaning and feature engineering, freeing up data talent for higher-level analysis.
Right-Tool Selection: Strategically choose the most effective advanced analytics techniques (e.g., AI, ML) based on specific business problems.
4. Prioritize Data Quality with Automation:
Automated Data Validation: Implement automated data quality checks to identify and rectify errors at the source, minimizing downstream issues.
Data Lineage Tracking: Track the flow of data throughout the ecosystem, ensuring transparency and facilitating root cause analysis for errors.
5. Cultivate a Data-Driven Mindset:
Metrics-Driven Performance Management: Align KPIs and performance metrics with data-driven insights to ensure actionable decision making.
Data Storytelling Workshops: Equip stakeholders with the skills to translate complex data findings into compelling narratives that drive action.
Benefits of a Precise Ecosystem:
Sharpened Focus: Precise access and clear roles ensure everyone works with the most relevant data, maximizing efficiency.
Actionable Insights: Strategic analytics and automated quality checks lead to more reliable and actionable data insights.
Continuous Improvement: Data-driven performance management fosters a culture of learning and continuous improvement.
Sustainable Growth: Empowered by data, organizations can make informed decisions to drive sustainable growth and innovation.
By focusing on these precise actions, organizations can create an empowered data analytics ecosystem that delivers real value by driving data-driven decisions and maximizing the return on their data investment.
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23... (John Andrews)
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
Innovative Methods in Media and Communication Research by Sebastian Kubitschk...
Managing Data as a Strategic Resource – Foundation of the Digital and Data-Driven Enterprise
1. Christine Legner
Professor of
Information Systems &
Academic Director CC CDQ
HEC Lausanne
Managing Data as a Strategic Resource –
Foundation of the Digital and Data-Driven
Enterprise
Swiss Data Day – November 8, 2017
2. Competence Center Corporate Data Quality (CC CDQ) | 2017 | 2
The Competence Center Corporate Data Quality (CC CDQ)
is an expert community and research consortium
Founded in 2006 · 30+ members · 50+ CC CDQ workshops · 1,500+ contacts within the CDQ community · 100+ bilateral projects
Consortium research is conducted jointly by research institutions and companies
NB: Overview comprises both current and former partner companies
3.
Effective data management –
foundation of the digital and data-driven enterprise
[The CDQ Data Excellence Model links goals, enablers and results: goals (business capabilities, data management capabilities), enablers (data strategy; people, roles & responsibilities; processes & methods; data lifecycle; data applications; data architecture; performance management) and results (business value, data excellence).]
The CDQ Data Excellence Model https://cc-cdq.ch/data-excellence-model
4.
Agenda
1. The changing role of data – data as a strategic resource
2. Real-world challenges in the digital and data-driven enterprise
3. Conclusion
5.
Data is a valuable resource – not only for tech giants!
http://www.economist.com/news/leaders/21721656-data-economy-demands-new-approach-antitrust-rules-worlds-most-valuable-resource
“Data is becoming the new raw material of business: an economic input almost on par with capital and labor. Every day I wake up and ask how can I flow data better, manage data better, analyze data better.”
Rollin Ford, Chief Administrative Officer, Walmart
6.
Accessed from https://www.amazon.com/adidas-miCoach-G83963-Smart-Ball/dp/B00L7R2CWO on 2016-06-22
Digital and data-driven business models
7.
Transformation towards the digital and data-driven enterprise
leads to the understanding of data as a strategic resource
Data-driven enterprise (Provost & Fawcett 2013; Davenport, 2014)
Goals:
• maximize the use of data and analytics
• promote data-driven and fact-based management approaches
Priorities:
• leverage BI and analytics for real-time decisions
• explore big data platforms and advanced analytics
New roles and stakeholders:
• Chief Data Officer, data scientists, BI experts...
Digital transformation (Matt et al. 2015; Westerman et al. 2014)
Goals:
• use of digital technologies to radically improve performance and reach of the enterprise
Priorities:
• digital business models and products/services
• operational excellence in existing business processes
• digital customer experience and interaction
New roles and stakeholders:
• Chief Digital Officer, digital initiatives, …
Two complementary (yet overlapping) trends
It is all about data!
8.
Traditional data management mainly focuses on operational
business processes
Based on Schierning (2016): Digitalization - Challenges and Opportunities for Product Based on Information Management. Presented at the 48th CC CDQ Workshop on February 25th 2016
[Diagram: the company sources, produces and distributes a product/service to meet demand; the order fulfillment cycle (fulfill the demand). Traditional focus: operational excellence.]
9.
Digital and data-driven businesses rely on closed information
loops and integrate the customer in real-time
Based on Schierning (2016): Digitalization - Challenges and Opportunities for Product Based on Information Management. Presented at the 48th CC CDQ Workshop on February 25th 2016
[Diagram: alongside the order fulfillment cycle (fulfill the demand), an activation cycle (communicate benefits and create demand) closes the loop with the customer: promote, consume/use. Drivers: customer interaction, personalized products & services, Industry 4.0.]
10.
Digital and data-driven businesses rely on closed information
loops – involving customers, suppliers, R&D partners and more
Based on Schierning (2016): Digitalization - Challenges and Opportunities for Product Based on Information Management. Presented at the 48th CC CDQ Workshop on February 25th 2016
[Diagram: three closed loops around the company: the order fulfillment cycle (fulfill the demand), the activation cycle (communicate benefits and create demand) and the innovation cycle (align the product/service offering to customer needs), running from insight and ideation via development to the product/service. Drivers: customer interaction, personalized products & services, Industry 4.0, collaborative innovation, regulatory requirements.]
11.
Data?!?
The data universe is becoming increasingly complex!
Leveling et al.: Big Data Analytics for Supply Chain Management, 2014.
Community & Reference Data: business partner addresses, standards, regulations, country codes, GTINs
Big & Open Data: sensor data, tweets, social media streams, weather data, news, …
Corporate Nucleus Data: master data (vendor, product, customer), transaction data, company documents
12.
Agenda
1. The changing role of data – data as a strategic resource
2. Real-world challenges in the digital and data-driven enterprise
3. Conclusion
13.
In practice, data is hardly managed as a strategic resource
Source:
1) http://www.cio.com/article/2375573/leadership-management/cios-consider-putting-a-price-tag-on-data.html
2) https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards
3) https://hbr.org/2016/12/breaking-down-data-silos
“Only 3% of companies’ data meets basic quality
standards.”
Harvard Business Review, September 2017
“It's frustrating that companies have a better sense of the
value of their office furniture than their information assets.”
Douglas Laney, Technology Analyst at Gartner
“80% of the work involved (in advanced data analytics) is
acquiring and preparing data.”
Harvard Business Review, December 2016
14.
Only a few companies know the value of their data
Big Data At Caesars Entertainment –
A One Billion Dollar Asset?
The most valuable of the individual assets …
is the data collected over the last 17 years
through the company’s Total Rewards loyalty
program, which gained Caesar’s a reputation
as a pioneer in Big Data-driven marketing.
How much is your data worth?
https://www.forbes.com/sites/bernardmarr/2015/05/18/when-big-data-becomes-your-most-valuable-asset/#561009e1eefd
15.
Financial valuation of data
Market approach: based on market prices or multiples. Leading question: What is the price a buyer would pay for an asset on a competitive market? Data valuation context: often not suitable; in many cases, markets and market prices for intangible assets do not exist. Examples: data market prices.
Income approach: present value of cash flows attributable to an asset. Leading question: What is the value that my data generates in the business processes? Data valuation context: suitable; cash flows from the use of data are a good measure for data value. Example: use-based valuation 3).
Cost approach: reproduction or replacement cost. Leading question: How much would it cost to reproduce or replace an asset? Data valuation context: suitable; in many cases, data reproduction cost can be quantified reliably. Tools and methods for application: Reproduction Cost Method 2).
1) Table adapted from IDW S5
2) The cost-based approach for data valuation was developed and practically applied in a prior research project. An overview of the concept and functioning of the cost-based valuation approach is provided in: Schmaus, P. (2015). Bewertung von Stammdaten als Intangible Asset. Controlling, 27(7), 392–395. doi:10.15358/0935-0381-2015-7-392. For additional documentation and background on the cost-based valuation tool please do not hesitate to contact the authors of this presentation.
3) The use-based approach for data valuation was developed and practically applied in a prior research project. Zechmann, A. & Möller, K. (2016). Finanzielle Bewertung von Daten als Vermögenswerte. Controlling, 28(10), 558-566.
[Example from the valuation tool (Phase 2 – Valuation & Analysis, step 2.3 "Calculating value of master data and analysis"): customer master data, specification: ERP data (SAP), Country=DE, account group (tbd); quantity 10.000 records; master data production costs 445.579,57 EUR; average master data age 29 months; average quality 89,40%. Impairments: total usage impairment 301.212,87 EUR (76,88%), total quality impairment 90.595,04 EUR (23,12%), total others impairment -13,24 EUR (0,00%); total impairment 391.794,68 EUR (87,93%). Resulting value of customer master data: 53.784,89 EUR.]
A. Zechmann: Data Valuation
16.
Cost approach: Applying a »Reproduction Cost Method« 1) to measure customer master data
Guiding question: What would it cost to produce a perfect duplicate of the data, with the same attributes and the same DQ?
Functioning: the cost-based data value equals the cost to reproduce the data, reduced by adjustment charges due to lacking DQ (following general accounting principles).
The numbers presented are an example and do not represent the actual figures of the valuation case.
1) Schmaus, P. (2015). Bewertung von Stammdaten als Intangible Asset. Controlling, 27(7), 392–395.
A. Zechmann: Data Valuation

Data quality impairment classes:
Class 1: DQ < 50% → 95% impairment
Class 2: DQ ≥ 50% and < 80% → 80% impairment
Class 3: DQ ≥ 80% and < 90% → 30% impairment
Class 4: DQ ≥ 90% and < 98% → 10% impairment
Class 5: DQ ≥ 98% → 0% impairment

Last-use impairment classes:
Class 1: used within the last 6 months → 0% impairment
Class 2: > 6 and ≤ 12 months → 10% impairment
Class 3: > 12 and ≤ 24 months → 50% impairment
Class 4: > 24 and ≤ 36 months → 75% impairment
Class 5: > 36 months → 95% impairment
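A minimal sketch of how the two impairment tables on this slide could be applied in code. How the slide combines the DQ and last-use impairments is not fully specified, so applying them multiplicatively here is an assumption:

```python
# Sketch of the cost-based ("Reproduction Cost Method") valuation: data value
# equals reproduction cost, reduced by impairments for lacking DQ and lack of use.

def dq_impairment(dq: float) -> float:
    """Impairment percentage for a record's data quality (classes from the slide)."""
    if dq < 0.50:
        return 0.95
    if dq < 0.80:
        return 0.80
    if dq < 0.90:
        return 0.30
    if dq < 0.98:
        return 0.10
    return 0.0

def use_impairment(months_since_use: int) -> float:
    """Impairment percentage for time since last use (classes from the slide)."""
    if months_since_use <= 6:
        return 0.0
    if months_since_use <= 12:
        return 0.10
    if months_since_use <= 24:
        return 0.50
    if months_since_use <= 36:
        return 0.75
    return 0.95

def record_value(reproduction_cost: float, dq: float, months_since_use: int) -> float:
    """Cost-based value of one record after both impairments.
    Assumption: the two impairments are combined multiplicatively."""
    return (reproduction_cost
            * (1 - dq_impairment(dq))
            * (1 - use_impairment(months_since_use)))
```

For example, a record costing 100 EUR to reproduce, with 85% quality and last used 13 months ago, would be valued at 100 × 0.70 × 0.50 = 35 EUR under this assumption.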
17.
Cost approach: Applying a »Reproduction Cost Method« 1) to measure customer master data
Guiding question: What would it cost to produce a perfect duplicate of the data, with the same attributes and the same DQ?
Functioning: the cost-based data value equals the cost to reproduce the data, reduced by adjustment charges due to lacking DQ.
Example (the numbers presented are illustrative and do not represent the actual figures of the valuation case):
Company 1 (≈500'000 records): master data production cost 15 mn €, DQ adjustment charges 9 mn € (-60%), customer master data value 6 mn €
Company 2 (≈80'000 records): master data production cost 5 mn €, DQ adjustment charges 2 mn € (-40%), customer master data value 3 mn €
Production cost ratio 3 : 1 at a record ratio of ~6 : 1.
1) Schmaus, P. (2015). Bewertung von Stammdaten als Intangible Asset. Controlling, 27(7), 392–395.
A. Zechmann: Data Valuation
18.
Income approach: Applying a »Use-based Method« 1) to measure product master data
Guiding question: What are the economic benefits an organization obtains by using data in specific data use contexts of a business process?
Functioning: data use contexts result in economic benefits given actual DQ; their present value equals the use-based data value.
[Chart: cash flows from the use of product master data in the customer service process, from today through Year 1, Year 2, Year 3 and steady state. Per period: cost savings (590 TEUR today, 610 TEUR thereafter), less data quality management cost of 250 TEUR and cost from using data of 100 TEUR.]
Discounted cash flow valuation (valuation assumptions: discount rate 10%, growth rate 0%): use-based value of product master data = 2.232 TEUR.
The numbers presented are an example and do not represent the actual figures of the valuation case.
1) Zechmann, A. & Möller, K. (2016). Finanzielle Bewertung von Daten als Vermögenswerte. Controlling, 28(10), 558-566.
A. Zechmann: Data Valuation
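The discounted-cash-flow mechanics of the use-based method can be sketched as follows. The figures mirror the slide's structure (periodic savings less costs, 10% discount rate, 0% growth, a steady state after the explicit years) but do not reproduce the slide's case result, which depends on specifics not shown:

```python
# Sketch of the use-based (income) approach: value data by discounting the net
# cash flows its use generates. Figures are illustrative, not the slide's case.

def use_based_value(net_cash_flow: float, years: int,
                    discount_rate: float, growth_rate: float = 0.0) -> float:
    """Present value of explicit periods 1..years plus a terminal
    (steady-state) value modeled as a growing perpetuity."""
    pv = sum(net_cash_flow / (1 + discount_rate) ** t
             for t in range(1, years + 1))
    terminal = net_cash_flow * (1 + growth_rate) / (discount_rate - growth_rate)
    return pv + terminal / (1 + discount_rate) ** years

# Per-period net cash flow: cost savings less DQ management cost and cost of using data
net = 610 - 250 - 100  # 260 TEUR per period
value = use_based_value(net, years=3, discount_rate=0.10, growth_rate=0.0)
```

With a constant 260 TEUR per period and 0% growth, the explicit periods plus the discounted perpetuity collapse to 260 / 0.10 = 2,600 TEUR, which illustrates why the discount-rate and growth assumptions dominate the result.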
19.
Recommendations for managing data as a strategic resource
“It's frustrating that companies have a better sense of the value of their office furniture than their information assets.”
Douglas Laney, Technology Analyst at Gartner
Assess the business value and impact of data
→ Data valuation
20.
In practice, data is hardly managed as a strategic resource
“Only 3% of companies’ data meets basic quality
standards.”
Harvard Business Review, September 2017
“It's frustrating that companies have a better sense of the
value of their office furniture than their information assets.”
Douglas Laney, Technology Analyst at Gartner
“80% of the work involved (in advanced data analytics) is
acquiring and preparing data.”
Harvard Business Review, December 2016
Source:
1) http://www.cio.com/article/2375573/leadership-management/cios-consider-putting-a-price-tag-on-data.html
2) https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards
3) https://hbr.org/2016/12/breaking-down-data-silos
21.
Data often is not « fit for purpose »
https://hbr.org/2017/09/only-3-of-companies-data-meets-basic-quality-standards
https://hbr.org/2016/07/assess-whether-you-have-a-data-quality-problem
Friday Afternoon Measurement (FAM) Method
• Managers assemble 10-15 critical data attributes for the last 100 units of work completed by their departments → 100 data records.
• Managers and their teams work through each record, marking obvious errors.
• They then count up the total of error-free records → Data Quality (DQ) Score (between 0 and 100)
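The FAM steps above can be sketched in a few lines of code. The `has_error` check stands in for the manual review of each record, and the blank-attribute rule in the illustration is a hypothetical example of such a check:

```python
# Sketch of the Friday Afternoon Measurement: the DQ score is simply the number
# of error-free records among the last 100 units of work (here, any sample).

def fam_score(records, has_error) -> int:
    """Data Quality score: count of error-free records (0-100 for 100 records)."""
    return sum(1 for record in records if not has_error(record))

# Illustration with a hypothetical rule: a record is in error if any attribute is blank.
records = [
    {"name": "Acme AG", "city": "Bern"},
    {"name": "", "city": "Basel"},  # obvious error: blank name
]
score = fam_score(records, lambda r: any(v == "" for v in r.values()))
```

In practice the error check is a human judgment ("marking obvious errors"), so the function only automates the counting, not the review.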
22.
Changing the mindset …
From reacting to data quality “incidents” … … to proactively managing data
[Chart: master data quality over time when reacting to incidents: quality repeatedly drops below the DQ optimum after Projects 1, 2 and 3. Key: „Submarines“ of master data quality (e.g. migrations, process errors, inconsistent reports).]
[Chart: Schaeffler’s data management journey, 2000–2017. Maturity level rises from 3 (Defined) through 4 (Quantitatively managed) to 5 (Optimizing): MDM operational; build-up of network, governance and improvement; extended governance; governance internal & external. Data domains grow from Material, Customer and Vendor to new data domains.]
Schaeffler’s data management journey
CDQ Award 2016
https://www.cc-cdq.ch/cdq-good-practice-award
23. Competence Center Corporate Data Quality (CC CDQ) | 2017 | 23
… and establishing data ownership in business functions
[Chart] From functional silos (divisions: Automotive, Industrial, with business units BA, BB, BC; regions: Europe, Americas, Greater China, Asia/Pacific; functions: CEO functions, Operations, Finance, HR, R&D) to defined data ownership and an engagement model spanning divisions, regions and functions.
Schaeffler’s data management journey
Recommendations for managing data as a strategic resource
Assess the business value and impact of data
→ Data valuation
Measure and improve data quality
→ Data governance, SMART data management
In practice, data is hardly managed as a strategic resource
Changing the mindset …
From data hidden in silos … … to data democratization
In the digital and data-driven enterprise, data should be FAIR:
• Findable: unique and globally consistent identifier; metadata description
• Accessible: (meta)data are retrievable via a standardized communications protocol
• Interoperable: formal, accessible, shared language for representation; use of vocabularies
• Reusable: data usage license, detailed provenance, domain-relevant community standards
The FAIR Guiding Principles for scientific data management and stewardship:
https://www.nature.com/articles/sdata201618
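The four FAIR criteria can be turned into a coarse self-check for a dataset’s metadata. The sketch below is hypothetical: the metadata field names (identifier, access_protocol, vocabulary, license, provenance) are assumptions for illustration, not part of the FAIR principles themselves.

```python
# Hypothetical coarse FAIR self-check for one dataset's metadata record.
# Field names are illustrative assumptions, not prescribed by FAIR.

def fair_check(meta):
    """Return a per-criterion pass/fail dict for a metadata record."""
    return {
        "Findable": bool(meta.get("identifier")) and bool(meta.get("description")),
        "Accessible": meta.get("access_protocol") in {"https", "ftp", "api"},
        "Interoperable": bool(meta.get("vocabulary")),
        "Reusable": bool(meta.get("license")) and bool(meta.get("provenance")),
    }

dataset = {
    "identifier": "doi:10.1000/example",          # hypothetical identifier
    "description": "Customer master data extract",
    "access_protocol": "https",
    "vocabulary": "schema.org",                    # shared vocabulary in use
    "license": "CC-BY-4.0",
    "provenance": "exported from MDM hub, 2017-11-01",
}
print(fair_check(dataset))  # all four criteria satisfied
```

A real assessment would of course be richer (persistent identifiers, machine-readable licenses, community standards), but even a checklist like this makes gaps visible per dataset.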
… defining data semantics and making data FAIR
Good practices:
• Data catalogs & business glossaries
• Metadata management: “data about data”
• Semantic integration
[Chart] Layers of semantic integration: a conceptual business object model (Vendor, Order, Customer, Product) maps to canonical models (e.g. Customer, Product), which in turn map to the logical/physical models of the individual systems (ERP, CRM, MDM, HR, CMS, …). Example: the ERP “Customer”, CRM “Prospect” and MDM “Account” each carry the Global Customer ID; CRM and MDM additionally carry an Account ID, and all three carry a name attribute.
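The mapping from system-specific records onto one canonical customer can be sketched as follows. This is a minimal Python illustration; the systems and attributes mirror the chart, but the mapping table and record values are assumptions.

```python
# Sketch of semantic integration via a canonical model: system-specific
# records (ERP "Customer", CRM "Prospect", MDM "Account") are translated
# into one canonical customer keyed by a global customer ID.
# The mapping table and field names are illustrative assumptions.

FIELD_MAP = {
    "ERP": {"Global Customer ID": "global_id", "Customer Name": "name"},
    "CRM": {"Global Customer ID": "global_id", "Account ID": "account_id", "Name": "name"},
    "MDM": {"Global Customer ID": "global_id", "Account ID": "account_id", "Name": "name"},
}

def to_canonical(system, record):
    """Translate a system-specific record into the canonical customer model."""
    return {canon: record[src]
            for src, canon in FIELD_MAP[system].items() if src in record}

erp = {"Global Customer ID": "C-100", "Customer Name": "Acme AG"}
crm = {"Global Customer ID": "C-100", "Account ID": "A-7", "Name": "Acme AG"}
print(to_canonical("ERP", erp))  # → {'global_id': 'C-100', 'name': 'Acme AG'}
```

Because every translated record carries the same `global_id`, records from different systems can be matched and reconciled in the canonical layer rather than system by system.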
Example – Corporate Data League Wiki
… open up data, share and collaborate
Example: Corporate Data League Wiki
https://www.corporate-data-league.ch/meta/Corporate_Data_League
Example – Open Data @ SBB (https://data.sbb.ch/)
“Data in the hands of a few data experts can be powerful, but data at the fingertips of many is truly transformational.”
https://www.forbes.com/sites/brentdykes/2017/03/09/why-companies-must-close-the-data-literacy-divide/#3f35f92f369d
Recommendations for managing data as a strategic resource
Assess the business value and impact of data
→ Data valuation
Measure and improve data quality
→ Data governance, SMART data management
Democratize data and support data citizens
→ Data-sharing culture, FAIR data and applications
Agenda
1. The changing role of data – data as a strategic resource
2. Real-world challenges in the digital and data-driven enterprise
3. Conclusion
Effective data management is a foundation of the digital and
data-driven enterprise
Understand and assess the business value of data
→ Data valuation
Measure and improve data quality
→ Data governance, SMART data management
Democratize data and support data citizens
→ Data-sharing culture, FAIR data and applications
Data management is a journey –
Think big, start small & monitor progress!
[Chart] The CDQ Data Excellence Model: goals (data strategy), enablers (people, roles & responsibilities; processes & methods; data lifecycle; data applications; data architecture; performance management) organized into business capabilities and data management capabilities, and results (business value, data excellence).
The CDQ Data Excellence Model https://cc-cdq.ch/data-excellence-model
Questions?
Prof. Dr. Christine Legner
Professor of Information Systems &
Academic Director, Competence Center Corporate Data Quality (CC CDQ)
HEC Lausanne
christine.legner@unil.ch | Tel.: +41 76 3382782

Competence Center Corporate Data Quality (CC CDQ): https://cc-cdq.ch/
The CDQ Data Excellence Model: https://cc-cdq.ch/data-excellence-model
HEC Research Blog – Effective Data Management in a Digitally Driven World: http://wp.unil.ch/hecimpact/effective-data-management-in-a-digitally-driven-world/