The document discusses challenges related to migrating data from legacy systems to new applications and systems. It notes there are typically many source systems in various formats with incomplete or unknown information. Effective data migration requires understanding source systems, data mapping, quality analysis, and design of the migration process. It also stresses the importance of data governance and quality to ensure migrated data can be effectively used.
View the webinar here - https://bit.ly/2ErkxYY
Enterprises are moving their data warehouse to the cloud to take advantage of reduced operational and administrative overheads, improved business agility, and unmatched simplicity.
The Impetus Workload Transformation Solution makes the journey to the cloud easier by automating DW migration to cloud-native data warehouse platforms like Snowflake. The solution enables enterprises to automate the conversion of source DDL, DML scripts, business logic, and procedural constructs. With it, enterprises can preserve their existing investments, eliminate error-prone, slow, and expensive manual practices, mitigate risk, and accelerate time-to-market.
Join our upcoming webinar where Impetus experts will detail:
Cloud migration strategy
Critical considerations for moving to the cloud
Nuances of the migration journey to Snowflake
Demo – Automated workload transformation to Snowflake.
To view - visit https://bit.ly/2ErkxYY
Importance of ML Reproducibility & Applications with MLflow – Databricks
With data as a valuable currency and the architecture of reliable, scalable Data Lakes and Lakehouses continuing to mature, it is crucial that machine learning training and deployment techniques keep up in order to realize value. Reproducibility, efficiency, and governance in training and production environments rest on both point-in-time snapshots of the data and a governing mechanism to regulate, track, and make the best use of the associated metadata.
This talk will outline the challenges and importance of building and maintaining reproducible, efficient, and governed machine learning solutions, and will propose solutions built on open source technologies – namely Delta Lake for data versioning and MLflow for efficiency and governance.
The Path to Data and Analytics Modernization – Analytics8
Learn about the business demands driving modernization, the benefits of doing so, and how to get started.
Can your data and analytics solutions handle today’s challenges?
To stay competitive in today’s market, companies must be able to use their data to make better decisions. However, we are living in a world flooded by data, new technologies, and demands from the business for better and more advanced analytics. Most companies do not have the modern technologies and processes in place to keep up with these growing demands. They need to modernize how they collect, analyze, use, and share their data.
In this webinar, we discuss how you can build modern data and analytics solutions that are future-ready, scalable, real-time, high-speed, and agile, enabling better use of data throughout your company.
We cover:
-The business demands and industry shifts that are impacting the need to modernize
-The benefits of data and analytics modernization
-How to approach data and analytics modernization: the steps you need to take and how to get it right
-The pillars of modern data management
-Tips for migrating from legacy analytics tools to modern, next-gen platforms
-Lessons learned from companies that have gone through the modernization process
Implementing the Data Maturity Model (DMM) – DATAVERSITY
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization’s Data Management capabilities. This model—based on the Capability Maturity Model pioneered by the U.S. Department of Defense for improving software development processes—allows an organization to evaluate its current-state Data Management capabilities, discover gaps to remediate, and identify strengths to leverage. In doing so, this assessment method reveals organizational priorities, business needs, and a clear path for rapid process improvements.
In this webinar, we will:
Describe the DMM model, its purpose and evolution, and how it can be used as a roadmap for assessing and improving organizational Data Management and Data Management Maturity
Discuss how to get the most out of a DMM assessment, including its dependencies and requirements for use
Discuss foundational DMM concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK)
Data-Ed Webinar: Best Practices with the DMM – DATAVERSITY
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization’s data management capabilities. The model allows an organization to evaluate its current-state data management capabilities, discover gaps to remediate, and identify strengths to leverage. The assessment method reveals priorities, business needs, and a clear, rapid path for process improvements. This webinar will describe the DMM and its evolution, and illustrate its use as a roadmap guiding organizational data management improvements.
Takeaways:
•Our profession is advancing its knowledge and has a widespread basis for partnerships
•New industry assessment standard is based on successful CMM/CMMI foundation
•Clear need for data strategy
•A clear and unambiguous call for participation
Modernizing the Analytics and Data Science Lifecycle for the Scalable Enterpr... – Data Con LA
Data Con LA 2020
Description
It’s no secret that the roots of Data Science date back to the 1960s and were first mainstreamed in the 1990s with the emergence of Data Mining. This occurred when commercially affordable computers started offering the horsepower and storage necessary to perform advanced statistics at scale.
However, the words “at scale” have evolved over time. The leap to “Big Data” is only one serial aspect of growth. Beyond the typical one-off studies that catalyzed the field of Data Mining, Data Science now fulfills enterprise and multi-enterprise use cases spanning much broader and deeper data sets and integrations. For example, AI and Machine Learning frameworks can interoperate with a variety of other systems to drive alerting, feedback loops, predictive frameworks, prescriptive engines, continual learning, and more. The deployment of AI/ML processes themselves often involves integration with contemporary DevOps tools.
Now segue to SEAL – the Scalable Enterprise Analytic Lifecycle. In this presentation, you’ll learn how to cover the major bases of a modern Data Science project – and Citizen Data Science as well – from conception, learning, and evaluation through integration, implementation, monitoring, and continual improvement. And as the name implies, your deployments will be performant and scale as expected in today’s environments.
Speaker
Jeff Bertman, CTO, Dfuse Technologies
Describes what Enterprise Data Architecture in a Software Development Organization should cover, and does so by listing over 200 data-architecture-related deliverables an Enterprise Data Architect should remember to evangelize.
Many significant business initiatives and large IT projects depend upon a successful data migration. Your goal is to minimize risk as much as possible through effective planning and scoping. This paper will provide insight into what issues are unique to data migration projects and offer advice on how to best approach them.
Slides from tutorial at EDW 2017 in Atlanta, GA on Implementing Agile Data Governance. Discusses how to write and add governance stories into existing Agile projects.
DAS Slides: Data Governance and Data Architecture – Alignment and Synergies – DATAVERSITY
Data Governance can have a varied definition, depending on the audience. To many, Data Governance consists of committee meetings and stewardship roles. To others, it focuses on technical Data Management and controls. Holistic Data Governance combines both of these aspects, and a robust Data Architecture and associated diagrams can be the “glue” that binds business and IT governance together. Join this webinar for practical tips and hands-on exercises for aligning Data Architecture and Data Governance for business and IT success.
Implementing Data Virtualization for Data Warehouses and Master Data Manageme... – Denodo
The ongoing evolution of business requirements and the growth of data volumes continue to put added challenges on existing DW and MDM implementations – challenges that in many cases cannot be met. Data Virtualization complements existing DW, MDM, and other architectures and business initiatives, providing the agility and flexibility – at a lower cost – for the enablement of Virtual MDM, self-service BI, operational BI, rapid prototyping, and real-time analytics.
More information and FREE registrations for this webinar: http://goo.gl/asYztF
Landing page for the entire Packed Lunch webinar series: http://goo.gl/NATMHw
Attend & get unique insights into:
How Data Virtualization can provide a simple and low cost alternative to traditional DW and MDM solutions
How Data Virtualization can enhance and extend existing DW or MDM solutions to provide a more agile data integration architecture
Case studies that demonstrate how Data Virtualization has increased agility to meet complex information needs
GDG Cloud Southlake #16: Priyanka Vergadia: Scalable Data Analytics in Google... – James Anderson
Do you know The Cloud Girl? She makes the cloud come alive with pictures and storytelling.
The Cloud Girl, Priyanka Vergadia, Chief Content Officer @Google, joins us to tell us about Scalable Data Analytics in Google Cloud.
Maybe, with her explanation, we'll finally understand it!
Priyanka is a technical storyteller and content creator who has created over 300 videos, articles, podcasts, courses, and tutorials that help developers learn Google Cloud fundamentals, solve their business challenges, and pass certifications! Check out her content on the Google Cloud Tech YouTube channel.
Priyanka enjoys drawing and painting which she tries to bring to her advocacy.
Check out her website The Cloud Girl: https://thecloudgirl.dev/ and her new book: https://www.amazon.com/Visualizing-Google-Cloud-Illustrated-References/dp/1119816327
Data Marketplace and the Role of Data Virtualization – Denodo
Watch full webinar here: https://bit.ly/3IS9sQS
A data marketplace is like an online shopping interface specializing in data. Ideally, it should work just like an online store, with minimal latency and maximum responsiveness. However, this does not mean that all of the data in the data marketplace needs to be stored in the same central repository.
In this session, Shadab Hussain, Americas Sales Head, Data Analytics at Wipro, a partner company with Denodo and a co-sponsor of DataFest 2021, talks about the role of data virtualization in enabling full-featured data marketplaces. Such data marketplaces provide real-time, curated access to data, even when the data is stored across many different sources throughout the organization.
You will learn:
- The main features of a data marketplace
- Why organizations need data marketplaces
- Why data marketplaces sometimes fail
- How data virtualization enables the most effective data marketplaces
- How one of Europe’s premier public healthcare system organizations leveraged a data marketplace to improve data consumption and ease of access
This migration plan aims to explore the potential of migrating from on-premises Hadoop to Azure Databricks. By leveraging Databricks' scalability, performance, collaboration, and advanced analytics capabilities, organizations can unlock faster insights and facilitate data-driven decision-making.
5 Critical Steps to Clean Your Data Swamp When Migrating Off of Hadoop – Databricks
In this session, learn how to quickly supplement your on-premises Hadoop environment with a simple, open, and collaborative cloud architecture that enables you to generate greater value with scaled application of analytics and AI on all your data. You will also learn five critical steps for a successful migration to the Databricks Lakehouse Platform along with the resources available to help you begin to re-skill your data teams.
Business Intelligence & Data Analytics – An Architected Approach – DATAVERSITY
Business intelligence (BI) and data analytics are increasing in popularity as more organizations are looking to become more data-driven. Many tools have powerful visualization techniques that can create dynamic displays of critical information. To ensure that the data displayed on these visualizations is accurate and timely, a strong Data Architecture is needed. Join this webinar to understand how to create a robust Data Architecture for BI and data analytics that takes both business and technology needs into consideration.
True or False? 10 M&A assumptions private companies should be testing – Deloitte Canada
The state of our economy shouldn’t be a reason for private companies not to pursue mergers and acquisitions. Any deal can carry risk at any time. What matters is how you manage it.
Simplifying M&A Consolidation | Salesforce Mergers and Acquisitions: Dreamforc... – Jade Global
Dreamforce 16: Simplifying M&A Consolidation
October 5th at 9:30AM, Cloud Expo: Partner Theater 2.
by Jai Kumar And Stella Sy
For more details, please visit: http://www.jadeglobal.com or contact us at marketing@jadeglobal.com
An Introduction to the design of business using business architecture – Craig Martin
Business Architecture is gaining interest from many non-traditional architecture stakeholders across the enterprise; however, most remain unclear about its scope and application. This webinar was presented through The Open Group as a lead-up to the London 2013 conference on business transformation. It provides an overview of the language, methods, and techniques of developing a business architecture, and assists architects in demonstrating its relevance to business leaders. It also provides an insight into the methods and techniques taught in the "Discovering Business Architecture" course run by Enterprise Architects.
Make Your Business More Flexible with Scalable Business Process Management So... – Perficient, Inc.
Architecture for scalable BPM solutions
Introduction
The role and shortcomings of SOA
Integrating legacy applications with the BPMS
Building high-performance BPM solutions
The role of a business rules management system in your architecture
Architecture to support event-driven business processes, reducing latency in processes and the company as a whole
Building a business intelligence architecture fit for the 21st century by Jon... – Mark Tapley
Objectives of the presentation:
To record some history – what has happened in the past that makes the future quite challenging.
To provide real examples of BI at work – good and bad.
To illustrate the nature of data and why it has become so important in driving the business forward in the 21st century.
To outline a way to align technology with the business so that effort and budget are spent in a way that will enable the future rather than support the past.
To propose a set of principles and ideas that can guide a company in making data available to all who have the penchant to turn it into useful and valuable information.
To describe the new organisation unit that will be needed to realise the dream.
The Next Generation of Big Data Analytics – Hortonworks
Apache Hadoop has evolved rapidly to become a leading platform for managing and processing big data. If your organization is examining how you can use Hadoop to store, transform, and refine large volumes of multi-structured data, please join us for this session, where we will discuss the emergence of "big data" and opportunities for deriving business value, the evolution of Apache Hadoop and future directions, essential components required in a Hadoop-powered platform, and solution architectures that integrate Hadoop with existing data discovery and data warehouse platforms.
Enterprise Enabler provides a broad spectrum of connectivity to hundreds of data sources, along with robust data transformation services that eliminate the constraints of the typical transformation engine, by handling all data in native form. This foundation of event-driven, real-time data-streaming technology moves information efficiently through business processes, workflows, SOA and composite applications.
TeleManagement Forum OSSera Case Study - AIS Thailand Service Manager Present... – Mingxia Zhang, Ph.D.
Tuesday, February 7th, 5:30 - 5:50 PM
Using Frameworx in Implementing a Unified Service Management Tool –Improving Organizational Collaboration and Communication
Examining the drivers for developing a Unified Service Management Tool to improve business processes at the service level in the Strategy, Infrastructure, and Product (SIP) area as well as Operations.
Outlining the development of an enterprise-wide Service Management application, which enabled solidification of the Service Development and Management processes in the SIP area and the Service Management and Operations processes
Quantifying the benefits of service management in terms of information sharing, process unification/implementation, cost savings, and revenue increases
This is an information-packed presentation on data migration made by BWIR, a global solutions and services partner to SolidWorks Enterprise PDM. It was showcased at SolidWorks World 2011, and the presentation covers data migration from other PDM/PLM systems to SolidWorks EPDM.
5 IT Trends That Reduce Cost And Improve Web Performance - A Forrester and Go... – Compuware APM
Virtualization, Cloud Computing, and Outsourcing promise significant cost savings and enhanced business agility. Implemented correctly, these initiatives can cut hardware and software costs, improve web application performance and quality, and positively impact business results. Learn how these 5 key business and technology trends are enabling companies to reduce costs AND ensure web application performance:
1. Virtualization
2. Outsourced Hosting & Management of Applications
3. Cloud Computing
4. Real-user Monitoring
5. ‘SaaS’ification of IT Management Software
Webinar: Successful Data Migration to Microsoft Dynamics 365 CRM | InSync – APPSeCONNECT
This #Webinar will cover everything you should know to prepare for a successful CRM data migration. Understand the intricacies of data and its importance to your organization, and explore the possibilities of successful data migration to your Microsoft Dynamics CRM platform.
A Customer Relationship Management (CRM) solution is an essential component of a business, as it takes into account all the details of customers and their journey. But a CRM is never functional without data! That is why moving data from one system to another is essential in order to set up a new system to utilize the data that already exists in the current system(s). This is a must for organizations that want to nurture their customers and help them grow.
Data migration can be a complex and cumbersome process, more complex than people realize, but with a solid strategy in place it can help organizations seamlessly transfer data from one system to another.
Most data migration solutions only transfer master data, but transactional data is just as valuable, and the right solution and tools can manage that as well. While you need to consider data sources, data fields, and other aspects when migrating data to Microsoft Dynamics CRM, this webinar will help you learn about the correct approach, best practices, and actions involved during the process.
#MSDyn365 #MSDynCRM
The key points to be covered in the webinar are:
- Introduction to Data Migration
- A Guide to Prepare Templates
- Ways to do Data Cleaning
- Options for Data Import
- How to do Data Verification
- Successfully Migrating Data to Dynamics 365 CRM
If you are planning to employ Microsoft Dynamics 365 CRM in your organization, this webinar will help you strategize about CRM data migration and plan for a seamless experience.
Start your #DataMigration today: https://insync.co.in/data-migration/
Data Migration and MDM – DMM
1. The Data Migration Challenge: Elements including MDM
by Wael Elrifai
London - New York - Dubai - Mumbai - Hong Kong, 2012
2. Understanding Migration
The assumptions: few source systems; specific data formats; all data available; documented system interfaces; valid data.
The truth: many source systems; more data in unknown formats; needed data is missing; unknown system interfaces; poor data quality.
“Migration is not just about moving the data… It’s about making the data work.”
3. These Application Projects have a Common Critical Requirement: Migrating Data
• Application implementation: from legacy into the new application
• Application upgrade: from the previous to the new version
• Application instance consolidation: from multiple instances to fewer
• M&A integration: from acquired systems
• Legacy retirement: from legacy into new systems
• Outsourcing: from company to outsourcer
4. Project Overview: Data Migration to ERP
• 200+ source systems
• Operating in 14 languages
• Different sets of users working in different regions with different applications and languages
• Highly fragmented lines of business and regions
• No concept of Data Governance or Master Data Management
• No concept of Data Quality Analysis
5. Methodology: Practical Data Migration
[Methodology diagram. Recoverable stages: Landscape Analysis (LA); Gap Analysis & Mapping (GAM); Migration Design & Execution (MDE); Legacy Decommissioning (LD); Migration Strategy & Governance (MSG); Data Quality Rules (DQR); Key Data Stakeholder Management (KDSM); System Retirement Plan (SRP). Supporting elements: a technical migration controller, profiling tool, data quality tool, and DMZ, with business engagement throughout.]
6. Team Structure & Communications
• Primary Business Team located in Hong Kong
• 6 Business Analysts
• 2 Technical Coordinators
• Primary Development Team in Hong Kong
• 8 Developers
• Offshore Development Team in Mumbai, India
• 4 Developers
• Unique aspects
• Agile/Scrum meetings conducted via video conference
• Email usage limited
• Assigned secretary, with output immediately posted on the wiki for comments
• Team Lead makes final “closing comments” on each issue
7. Application Migration: The Anatomy of Failure
Long development times
• Often many months or even years without any ‘visible’ signs of progress
• CAUSE: failure to properly decompose development into practical, achievable, and meaningful ‘phases’ and ‘sprints’
Long development times – for individual ETL flows
• Due to extensive and repeated re-working of ETL code
• Resulting from failures in unit testing and user acceptance testing
• CAUSE: poor and inadequate design
Considerable variations in quality & efficiency of code
• Increasing the time for new/other developers to modify code
• CAUSE: failure to define and firmly enforce standards
8. Application Migration: The Anatomy of Failure
Minimal attention to data cleansing or standardisation
• Leading to longer report development times
• And greater inconsistencies in reporting
• Effectively pushing data quality management onto report developers AND information consumers
• CAUSE: failure to recognise the importance and impact of employing a systematic approach to managing data quality
Poor reliability
• Arising from ‘unexpected’ variations in the structure or content of incoming source files
• CAUSE: failure to cater for Murphy’s Law, i.e. the most frequent and most obvious causes of failure
9. Application Migration: The Anatomy of Failure
Poor performance
• CAUSE: failure to give due consideration to the scale and complexity of ETL processes during the design stage
• CAUSE: failure to fully understand the underlying causes when performance problems become evident
• CAUSE: failure to routinely monitor performance or undertake adequate capacity planning to cater for gradual or step-change increases in data volumes
10. Application Migration: The Anatomy of Success
[Delivery workflow diagram. Recoverable elements: entity-level data model design; ‘mapping’ templates and ETL phasing; reusable components; forensic data analysis; sprints; hosted code translations; detailed functional design; detailed technical design, including peer review; enforced standards; a technical authority and reusable components; master schedule and master test schedule; build, unit test, and UAT; soft go-live and go-live.]
11. Abstraction of Rules & Reusability
• Automated ETL mapping development based on source system metadata
• Automated data type verification for flat-file data based on header information
• Consistent use of a single value-mapping table, abstracted to accommodate data migration rules
• A single generic “run script” which operates based on a simple dependency matrix (see the sketch below)
• This is more important in operational rather than data migration situations, but becomes important when dependencies are complex
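As a rough illustration of that last idea, here is a minimal sketch of a dependency-matrix-driven run script. The job names and the dependency structure are invented for the example; a real migration controller would invoke actual ETL flows rather than print.

# Minimal sketch of a generic "run script" driven by a dependency matrix.
# Job names and run_job's body are hypothetical, for illustration only.
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Dependency matrix: each job lists the jobs it depends on.
DEPENDENCIES = {
    "load_customers": set(),
    "load_products": set(),
    "load_orders": {"load_customers", "load_products"},
    "load_invoices": {"load_orders"},
}

def run_job(name: str) -> None:
    # Placeholder: in practice this would launch the ETL flow for `name`.
    print(f"running {name}")

def run_all(dependencies: dict[str, set[str]]) -> None:
    # static_order() yields each job only after all of its dependencies,
    # so complex chains need no hand-coded sequencing.
    for job in TopologicalSorter(dependencies).static_order():
        run_job(job)

if __name__ == "__main__":
    run_all(DEPENDENCIES)

Because the ordering is derived from the matrix alone, adding or re-wiring a job is a one-line change to the dictionary rather than an edit to the run logic, which is presumably the point of keeping the script generic.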
12. Data Migration Guiding Principles: Creating Data Standards to Reduce Complexity
[Process diagram. Recoverable elements: current-state environments (source tables, source attributes, upstream sources, downstream targets) are captured in as-is domain and entity models; future-state environments (enterprise application data models, ODS data models) feed a common-data-standards enterprise representation (domain model, entity model, entity relationship model) spanning the ODS, DW, customer systems, etc. Steps: rationalize domains and entities across current-state and future-state environments; rationalize attributes across current-state and future-state environments; map all application environments to the enterprise standard; and create initial common data standards together with an initial DQ program, an initial data ownership model, and initial data management and governance processes.]
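To make the attribute-rationalization step concrete, here is a small hedged sketch of mapping source-system attributes onto an enterprise-standard model. The system names, attribute names, and mapping table are all hypothetical, not taken from the deck.

# Hypothetical rationalization table: (system, source attribute) pairs mapped
# to a single enterprise-standard attribute name. All names are invented.
ATTRIBUTE_MAP = {
    ("crm_eu", "cust_nm"): "customer.name",
    ("erp_hk", "CUSTOMER_NAME"): "customer.name",
    ("crm_eu", "cust_tel"): "customer.phone",
    ("erp_hk", "TEL_NO"): "customer.phone",
}

def to_enterprise_standard(system: str, record: dict) -> dict:
    """Rename a source record's attributes to the enterprise standard."""
    out: dict = {}
    for attr, value in record.items():
        std = ATTRIBUTE_MAP.get((system, attr))
        if std is not None:
            out[std] = value
        else:
            # Unmapped attributes are surfaced for review, not silently loaded.
            out.setdefault("_unmapped", {})[attr] = value
    return out

print(to_enterprise_standard("crm_eu", {"cust_nm": "ACME", "cust_fax": "x"}))

The "_unmapped" bucket is one way such a mapping could feed the initial DQ program: anything the standard does not yet cover becomes visible instead of disappearing into the target.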
14. Data Governance - 14-step (sounds like a lot!) program
1. Review available documentation on process flow
2. Agree scope of work
3. Plan and schedule meetings
4. Produce initial definitions of DG framework
5. Assemble DG working group
6. Engage with Data Stewards
7. AS-IS business process analysis
8. AS-IS data analysis
9. Define TO-BE processes
10. Define TO-BE system requirements
11. Assemble business glossary
12. Introduce standardization of business-critical data items
13. Implement DG KPI tracking and DQ exception reporting
14. Conduct periodic audit of business processes
15. Master Data Management: Highlights
• DON’T FORGET! Your data migration tools may end up being the real-time MDM hub communication logic/tools as well; design appropriately
• Simplified load tools that can be used by analysts
• Custom match/merge algorithms
• Gray’s coding
• 14 languages, including European, Middle Eastern (right-to-left), and East Asian
• Some transliteration rules built using statistical regression on 30m customer records
• Match/merge algorithms with discrete variables and a user interface
• Ability to allow users to target hotspots
• Variable “sliders”: meshed variables for hotspot analysis allow for more merge-sensitivity flexibility (see the sketch after this list)
• Data analysis for predicting why false positives and false negatives occur
• Role of each source
• Types of data that most often “fail”
• Google Maps/address integration for matching (cloud), data enhancement, and more
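The deck does not show the match/merge implementation, so the following is only a minimal sketch of how per-field weight “sliders” might feed a match score. The field names, weights, and threshold are invented, and the standard library's difflib stands in for whatever similarity functions were actually used.

# Sketch of match/merge scoring with user-adjustable per-field weights
# (the "sliders"). Field names, weights, and the threshold are illustrative.
from difflib import SequenceMatcher

WEIGHTS = {"name": 0.5, "address": 0.3, "phone": 0.2}  # slider positions
MERGE_THRESHOLD = 0.85

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_score(rec1: dict, rec2: dict) -> float:
    # Weighted average of per-field similarities; normalising by the weight
    # sum lets each slider move independently of the others.
    total = sum(WEIGHTS.values())
    return sum(w * similarity(rec1[f], rec2[f]) for f, w in WEIGHTS.items()) / total

a = {"name": "ACME Ltd", "address": "90 Long Acre", "phone": "020 7849 3422"}
b = {"name": "ACME Limited", "address": "90 Long Acre, London", "phone": "020 7849 3422"}
score = match_score(a, b)
print(f"score={score:.3f} merge={score >= MERGE_THRESHOLD}")

Raising the weight on a noisy field makes merges more sensitive to that field, which is one plausible reading of how the sliders let users target hotspots.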
16. Testing
• Custom “black box” testing tool designed
• Specialized for database tests
• Requires the addition of some metadata columns to the data model: S_ID, Batch_ID, LOAD_TIME (see the sketch after this list)
• Automatic storage of test cases
• Test data
• Documentation on the test being run
• User metadata
• Test metadata
• Sets the database into a known state
• Can generate test data
• Single unified interface
• Fault-fix workflow management
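As a rough illustration of why those metadata columns matter for black-box testing, here is a small sketch using an in-memory SQLite database. The table, rows, and assertion are invented for the example; only the column names S_ID, Batch_ID, and LOAD_TIME come from the slide.

# Sketch of a "black box" database test leaning on the metadata columns
# named above. Everything except the column names is hypothetical.
import sqlite3

def make_known_state() -> sqlite3.Connection:
    # "Sets the database into a known state": build a fresh, fully
    # deterministic database before every test run.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE customers (
        id INTEGER, name TEXT,
        S_ID TEXT, Batch_ID INTEGER, LOAD_TIME TEXT)""")
    conn.executemany(
        "INSERT INTO customers VALUES (?, ?, ?, ?, ?)",
        [(1, "ACME", "crm_eu", 42, "2012-01-01T00:00:00"),
         (2, "Globex", "erp_hk", 42, "2012-01-01T00:00:00")])
    return conn

def test_batch_rowcount() -> None:
    conn = make_known_state()
    # Batch_ID lets the test assert on exactly the rows one run loaded,
    # ignoring anything loaded by earlier batches.
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM customers WHERE Batch_ID = ?", (42,)).fetchone()
    assert count == 2, f"expected 2 rows in batch 42, got {count}"

test_batch_rowcount()
print("ok")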
17. Documentation
• Automated
• Driven by:
• Business requirements documented in the custom testing tool and wiki documentation
• ETL tool metadata
• Custom testing tool metadata
This is highly contingent on being able to enforce developer rules about documentation within tools.
18. Risk Mitigation
Extract data early
• Data should be seen immediately. We’ve seen problems come up because data didn’t conform to expectations.
Convert data early
• Our existing build will allow for the first conversion to take place within weeks for all objects.
Convert data often
• An iterative approach to both data quality and conversion allows for repeated analysis. This should be driven by development schedules rather than inversely by validation schedules that aren’t related to development time. (A sketch of such a loop follows this list.)
Use real data from the start
• The conversion team should have direct access to source systems, without a dependency on another team to create extracts.
Seek to incorporate external and up-to-date information about your Master Data
• Tools like Google’s business services, D&B, Bloomberg, and others can help
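Here is a minimal, hedged sketch of that convert-validate-fix cycle. The conversion step, the single data quality rule, and the iteration cap are invented stand-ins for a real pipeline.

# Sketch of the "convert data often" loop: each cycle runs the conversion,
# validates against data quality rules, and feeds failures back for fixing.
# All functions are illustrative stand-ins.

def convert(records: list[dict]) -> list[dict]:
    # Toy conversion: trim and upper-case the name field.
    return [{**r, "name": r["name"].strip().upper()} for r in records]

def validate(records: list[dict]) -> list[dict]:
    # A single illustrative DQ rule: name must be non-empty.
    return [r for r in records if not r["name"]]

def migration_cycle(source: list[dict], max_iterations: int = 3) -> list[dict]:
    converted = []
    for i in range(1, max_iterations + 1):
        converted = convert(source)
        exceptions = validate(converted)
        print(f"cycle {i}: {len(exceptions)} exceptions")
        if not exceptions:
            return converted
        # In practice: analyse the exceptions, refine cleansing rules, re-run.
        source = [r for r in source if r["name"].strip()]
    return converted

migration_cycle([{"name": " acme "}, {"name": ""}])

Running the cycle repeatedly, on the development team's cadence, is what makes the exception counts trend toward zero before go-live rather than surfacing at the eleventh hour.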
19. Data Migration through Information Development: Lessons Learned
Prioritise Planning
• Define business priorities and start with quick wins
• Don't do everything at once: deliver complex projects through an incremental programme
• “Chunks” need to be appropriate, based on elements like homogeneity of the front-end, single sets of business users across geographies, language usage, etc.
Focus on the Areas of High Complexity
• Don't wait until the 11th hour to deal with data quality issues: fix them early
• Follow the 80/20 rule for fixing data: do this iteratively through multiple cycles
• Understand the sophistication required for application co-existence; in the short term, your systems will get more complex
Keep the Business Engaged
• Communicate continuously on the planned approach defined in the strategy; the overall Blueprint is the communications document for the life of the programme
• Try not to be completely infrastructure-focused for long-running releases: always deliver some form of new business functionality
• Align the migration programme with analytical initiatives to give business users more access to data
• Ensure that the Data Governance programme has “teeth”
20. Questions?
Peak Consulting UK Headquarters
90 Long Acre, Covent Garden
London WC2E 9RZ
T: +44 (0)20 7849 3422
F: +44 (0)20 7990 9478
www.peakconsulting.eu