Foundational strategies for trusted data
Getting your data to the cloud
Ashwin Ramachandran | Director of Product Management, Data Integration
Data integrity is trust
• Accuracy
• Consistency
• Context
Trusted data requires
• Visibility into all data
• Real-time sharing of data
• Removal of data silos – on-prem to the cloud
• Strong foundation in data integration
What does a strong data integration foundation enable you to do?
• Centralized BI and analytics
• Data discovery
• Data democratization with governance
• Next-gen projects – AI and ML
Data integration, easier said than done
• Silos of multi-structured data
• Legacy IT infrastructure
• Data archives
• Employees
Trends impacting data integration success
• Almost every enterprise has data silos that prevent enterprise-wide access to data
• More than half of enterprises rely on legacy systems to run more than half of their business-critical applications
• Distributed cloud architectures promise agility but do not readily integrate with existing infrastructure
• Cloud data platform market consolidation creates uncertainty
Ingredients of successful data integration
1. Clear business case
2. Understand architecture
3. Extract data
4. Scale delivery
Understand your architecture
Growing demand for data integration architectures that are flexible, agile, and adaptable to rapid change.
• Distributed cloud architectures promise agility but may not readily integrate with existing infrastructure
• New requirements for cloud data platforms may break current data integration architectures
• Critical to understand the business applications that will be impacted by moving legacy systems to the cloud
Shifting your architecture
Source: McKinsey & Company, July 2020
1. From on-premises to cloud-based data platforms
2. From batch to real-time data processing
3. From pre-integrated commercial solutions to modular approaches
4. From point-to-point to decoupled data access
5. From enterprise data warehouses to domain-based architectures
6. From rigid data models to flexible, extensible data schemas
Extracting the right data
Legacy data can provide a treasure trove of information that can transform your business when leveraged via a streaming paradigm.
• Connect applications together, leveraging the existing transactional capabilities of the current application platform and the wealth of new capabilities of the cloud
• Feed analytics with up-to-date information so your business runs on current insight
• Port workloads to less-expensive, strategic platforms
The importance of legacy data
Your traditional systems – including mainframes, IBM i servers, and data warehouses – adapt and deliver increasing value with each new technology wave.
• 55% of executives say their customer-facing applications are completely or very reliant on mainframe processing (Forrester Consulting, 2019)
• 72% increase in transaction volume on mainframe environments in 2019 (BMC, 2019)
• $1.65 trillion invested by enterprise IT to support data warehouse and analytics workloads over the past decade (Wikibon, “10-Year Worldwide Enterprise IT Spending 2008-2017”)
Scaling delivery
Growing data integration initiatives should not require increased spend or additional expertise.
• Consider data integration frameworks that can handle large data volumes
• Determine how legacy data will be streamed into cloud environments
• Understand how a hybrid, multi-cloud, or cloud-only deployment can enable scalability
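The points above can be sketched in code. This is a minimal, hypothetical illustration of streaming legacy data in bounded micro-batches so that growing volumes do not demand more memory; `stream_to_cloud` and its `load_batch` callback are invented stand-ins for a real connector's bulk load call, not any actual product API.

```python
# Hypothetical sketch: stream rows from a legacy source into a cloud target
# in fixed-size micro-batches, so memory use stays flat as volumes grow.
from itertools import islice
from typing import Callable, Iterable, Iterator, List


def batched(rows: Iterable[dict], size: int) -> Iterator[List[dict]]:
    """Yield fixed-size batches from any row iterator, never holding
    more than `size` rows in memory at once."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch


def stream_to_cloud(rows: Iterable[dict],
                    load_batch: Callable[[List[dict]], None],
                    size: int = 1000) -> int:
    """Push rows to the cloud target batch by batch; returns rows sent."""
    sent = 0
    for batch in batched(rows, size):
        load_batch(batch)  # e.g. a bulk INSERT or staged file upload
        sent += len(batch)
    return sent


# Usage with a stand-in source and an in-memory "target":
legacy_rows = ({"id": i, "amount": i * 10} for i in range(2500))
loaded: List[List[dict]] = []
total = stream_to_cloud(legacy_rows, loaded.append, size=1000)
print(total, [len(b) for b in loaded])  # 2500 [1000, 1000, 500]
```

Because the source is consumed lazily, the same loop works whether the legacy extract holds thousands of rows or billions; only the batch size and the `load_batch` implementation change.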
Considerations for scaling
Ask yourself these questions when looking to address how you account for increased data volumes, sources, and targets.
• Do you have a solution in place for data integration that can future-proof DI workflows?
• Where is the performance “choke point” for data integration today? How will you address it?
• As volumes grow, how will you share only the changed data with those who need it?
Best practice in action: demo
Connect and Snowflake
• Sources: mainframe, IBM i, traditional ETL sources, files, RDBMS, etc.
• Convert mainframe, IBM i, and data from other sources to be shared anywhere on Snowflake
• Feed BI and analytics tools
• Deploy Connect capabilities on-prem, in cloud, or in hybrid environments
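The “convert” step above can be sketched in a few lines. This hypothetical example decodes a fixed-width EBCDIC (code page 037) mainframe record into a plain dict, ready to stage as JSON for a cloud warehouse such as Snowflake. The field layout is invented for illustration and is not Connect’s actual format handling.

```python
# Hypothetical sketch: decode a fixed-width EBCDIC mainframe record.
# Layout: (field name, start offset, length) -- invented for illustration.
RECORD_LAYOUT = [("cust_id", 0, 6), ("name", 6, 20), ("balance", 26, 9)]


def decode_record(raw: bytes) -> dict:
    text = raw.decode("cp037")  # cp037 is Python's built-in US EBCDIC codec
    row = {}
    for field, start, length in RECORD_LAYOUT:
        row[field] = text[start:start + length].strip()
    row["balance"] = float(row["balance"])  # numeric field, zero-padded text
    return row


# Build a sample EBCDIC record the way a mainframe extract might look:
raw = ("000042" + "ACME CORP".ljust(20) + "001234.50").encode("cp037")
print(decode_record(raw))  # {'cust_id': '000042', 'name': 'ACME CORP', 'balance': 1234.5}
```

Real mainframe extracts add complications this sketch ignores (packed decimal fields, OCCURS clauses, multiple record types), which is exactly the complexity a conversion tool is meant to absorb.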
Looking at the next 90 days…
• Define your business case
• Ensure you are defining an architecture that will serve you across cloud environments
• Remember valuable data lives in legacy data sources
• Understand how your team can scale for data integration today and tomorrow
Join us for more sessions on developing your data integrity strategy!
The Precisely Data Integrity Suite
• Delivers the essential elements of data integrity – accuracy, consistency, and context
• Built on data integration, data quality, location intelligence, and data enrichment capabilities trusted by over 12,000 enterprise customers
• Modular architecture allows you to choose just the capabilities you need – and implement them alongside your current infrastructure at scale
• Empowers faster, confident decision-making with trusted data
Modules: Data Integration | Data Quality | Location Intelligence | Data Enrichment
Learn more at
precisely.com/data-integrity


Editor's Notes

  • #6 Centralized BI and analytics – central management of business insights helps shift analysis away from one-offs done in isolation. Data discovery – business end-users can work with large data sets and get answers to the questions they are asking; data discovery helps the enterprise lose some of the bulk when it comes to running analytics. Data democratization – enables more users to have autonomy with data, but without the risk of exposing sensitive data in a way that could violate regulations or internal best practices.
  • #7 When it comes to building unified analytics platforms, there is a level of complexity that exists across an enterprise: silos of multi-structured data that are difficult to integrate (ERP, CRM, mainframes, RDBMS, files, logs, cloud data sources); heterogeneous legacy IT infrastructure (EDWs, data lakes, marts, servers, storage, archives, and more); and thousands of employees, maybe more, with lots of inaccessible information.
  • #8 In almost every enterprise, silos prevent enterprise-wide access to data, data analysis, and insight delivery. Projects happen in departmental or organizational silos, and data, knowledge, and insights are corralled there when they could deliver tremendous value if deployed across the organization. Legacy is the mainstay but is not effectively integrated, and system cut-over needs to happen without disrupting the business. 2019 was marked by market consolidation of big data vendors: Cloudera and Hortonworks merged, MapR disbanded, and new vendors born in the cloud, like Databricks, are rising – we see many customers waiting for the dust to settle.
  • #11 Article for reference - https://www.mckinsey.com/business-functions/mckinsey-digital/our-insights/how-to-build-a-data-architecture-to-drive-innovation-today-and-tomorrow#
  • #13 Most large enterprises have made major investments in data environments over a period of many years – legacy data can provide a treasure trove of information that can transform your business when leveraged via a streaming paradigm. These environments contain the data these businesses run on, which today powers the strategic initiatives driving the business forward: machine learning, AI, and predictive analytics. Legacy platforms (mainframe and IBM i) continue to adapt with each new wave of technology and are not going away anytime soon. Integrating legacy data into your projects brings several advantages: connect applications together, leveraging the existing transactional capabilities of the current application platform and the wealth of new capabilities of the cloud; feed analytics with up-to-date information so your business runs on current insight; and port workloads to less-expensive, strategic platforms.
  • #14 Integrating legacy data into data streaming can be problematic for several reasons. First, a legacy data source may not have native connectivity to a downstream target. Second, processing requirements of legacy data can cause a slowdown in a data stream. Loading hundreds, or even thousands, of database tables into a big data platform – combined with an inefficient use of system resources – can create a data bottleneck that hampers your streaming data pipelines from the start.
  • #15 Look for solutions that insulate your organization against the underlying complexities of your technology stack. Consider that faster data delivery may break current data pipeline structures. Seek scalability and efficiency that improve the speed with which data is shared across business applications and minimize network impact by replicating only the data changes that have been captured.
  • #19 And that is why Precisely has introduced the Precisely Data Integrity Suite. It delivers the essential elements of data integrity – accuracy, consistency, and context – to give your business the confidence to make better, faster decisions based on trusted data. Built on proven data integration, data quality, location intelligence, and data enrichment capabilities trusted by more than 12,000 global organizations, the Precisely Data Integrity Suite delivers unmatched value for any data integrity initiative. And with a modular architecture, you can pick just the capabilities you need, implement them alongside your current infrastructure, and add on new capabilities as your needs grow.