The rapid rate at which new technologies for cloud-based computing are being developed and deployed, including big data analytics, the Internet of Things, machine learning, and artificial intelligence, is truly astonishing. The accelerating scale and speed of these advancements is compelling organizations to migrate the majority of their data and infrastructure to the cloud. Even the mainframe.
It's not that mainframe computing systems are going away any time soon. In fact, mainframe platforms are as essential today for running global, enterprise-scale businesses as they have ever been. And despite years of pundits advocating otherwise, migrating off the mainframe and onto cloud platforms has rarely proven worth the cost and business risk involved.
But today's IT leaders, tasked with moving enterprise IT operations toward an integrated, future-proof, and democratized future, cannot avoid these changes while still running an efficient business. Organizations are turning to Precisely to help them maintain current operations while bringing their mainframe data to the cloud.
During this session we discuss:
• Why it's imperative for IT leaders to unlock access to mainframe data in a cloud-first world
• The challenges and complexities of democratizing data with mainframe systems
• How to successfully integrate mainframe data into cloud-based environments
3. Business trends influencing how companies are moving to the cloud
• Financial responsibility for IT expenditures
• Data-centric cloud computing
• Democratized data
4. “85% of organizations will embrace a cloud-first principle by 2025” - Gartner
“55% of leaders cite data modernization as the reason for their shift to cloud” - Deloitte
“Approximately $100 billion of wasted migration spend is expected over the next three years” - McKinsey & Company
5. Migration comes at a heavy cost
• Over 75% of cloud migration projects are over budget
• 37% of spend goes to systems integrators, because organizations lack the cloud skills in house
• 15% goes to decommissioning costs
• 38% of cloud migration projects run behind schedule
• Companies are looking to staff 50% of their cloud talent in house
7. IT is challenged with moving mainframe data to the cloud
• IT execs and managers need to ensure security, quality, and governance of all data
• There is a huge gap between distributed, democratized, “cloud-native everything” and data coming from the mainframe
Mainframes are a legacy technology - why should we care?
8. Confidential: Prepared for Precisely Customers and Prospects
Mainframes host the most critical applications
• 71% of the Fortune 500
• 2.5 billion transactions per day, per mainframe
• $2.9 billion mainframe market by 2025
• 23 of the world's top banks
• 92 of the world's top insurers
• 10 of the top 25 US retailers
9. Modern platforms lack native mainframe integration (distributed and cloud environments vs. the mainframe)
10. Democratizing mainframe data for cloud analytics
• IT needs standards for infrastructure that remain seamless to employees
• Data integrity is needed
• Users demand flexibility
11. Exploding need for trusted data
• 83% of CEOs want their organization to be more data-driven
• Digital transformation investments to top $6.8 trillion globally by 2023
• 68% of Fortune 1000 businesses now have CDOs - up 6x in the last decade
• Global data infrastructure spending expected to reach $200 billion this year
Data is the fuel for decision-making today
Sources: IDC, Gartner, Forbes
12. But your team doesn't trust your data
• Only 30% of data practitioners completely trust their data
• Only 27% of data practitioners strongly believe their actions are driven by data analysis
Why teams don't trust their data:
• Can't get it fast enough
• Don't understand it
• Can't trust it
• Don't have the context to use it
• Don't know when it's going to break
Source: IDC
13. For trusted data, you need data integrity
Data integrity is data with maximum accuracy, consistency, and context for confident business decision-making
15. Building an environment that can adapt
1. Data sharing models
2. Business process changes
3. Cloud environment interoperability
4. Deep IT management integration
18. Cloud / VPC / On-Premises
Data Integrity services: Data Integration, Data Observability, Data Quality, Geo Addressing, Spatial Analytics, Data Governance, Data Enrichment
APIs and SDKs
Enterprise business systems: enterprise apps, analytics tools, Precisely industry apps, BI dashboards, AI/ML
Enterprise data sources: Business Intelligence, CRM, workforce mgmt., data warehouse, ERP, billing
Data Integrity Foundation: data catalog, intelligence agents
19. How the Data Integrity Suite does it better: Modular, Interoperable, Easy, Intelligent, Dynamic
21. Data Integration differentiators
• Real-time data streaming gives you fast access to fresh data when and where you need it
• A business-friendly user interface allows first-time users to create data pipelines without coding
• A build-once, deploy-anywhere principle allows you to build data pipelines in the Precisely Cloud and deploy them wherever your data lives
• 50+ years of domain expertise in mainframe and IBM i systems is built into the Data Integration module to handle your complex data sources
• Integration with the Data Integrity Suite Foundation enables Data Integration to share metadata with other modules - exponentially building value and spurring innovation
22. A data catalog is core to managing data integrity
• Provides a single, searchable inventory of your organization's data assets
• Allows technical users to easily search, explore, understand, and collaborate on critical data assets
• Enables data stewards to monitor, audit, certify, and track data across its lifecycle through integrated data governance
• Allows for visualization of data relationships, data lineage, and data's business impact
• Supports the sharing of knowledge, comments, and surveys
23. The Data Integrity Suite's data catalog is unique
• Catalogs business and technical metadata from 100s of data sources - even the most complex sources like IMS and VSAM
• Provides easy searchability and visibility into business and technical metadata
• Automatically captures metadata through any Data Integrity Suite connection - for data replication, data governance, data quality, and more - and makes it available for the creation of inferences and recommendations such as:
  • Detection of PII prior to replicating it to another system
  • Data observability recommendations for replicated data
  • Data quality rule recommendations for observed data
  • Data enrichment recommendations and geospatial context for data quality pipelines
  • Recommendations for governing ownership of spatial data
Research shows that:
85% of organizations will embrace a cloud-first principle by 2025
55% of leaders cite data modernization as the reason for their shift to cloud
Approximately $100 billion of wasted migration spend is expected over the next three years
So despite the excitement around getting to the cloud, organizations need to be careful…
Because moving to cloud comes with a heavy cost if not done strategically.
McKinsey & Company recently conducted a study that shows that:
Over 75% of cloud migration projects are over budget
37% of spend goes to systems integrators, because organizations lack the in-house cloud skills to manage these projects
15% goes to decommissioning costs for other platforms
38% of cloud migration projects run behind schedule
Companies are looking to staff 50% of their cloud talent in house so they do not need to rely as heavily on third parties
(If you are interested in learning more about this topic, all of this data came from this McKinsey Study https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/cloud-migration-opportunity-business-value-grows-but-missteps-abound)
Mainframes are still the backbone for the biggest organizations in the world
71% of the Fortune 500 rely on the mainframe for their mission-critical transactional systems, and mainframes are present in every vertical, from FinServ to insurance to retail.
When talking to these organizations, it's not unusual to hear that up to 80% of their corporate data originates on the mainframe, and that business is growing: the mainframe market is expected to reach $2.9 billion by 2025.
Significant differences that impede the integration of mainframe data into cloud environments include:
• Data compression and formats
• Data structure definitions
• Mainframe operations
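The "data formats" difference above is concrete: mainframe records typically store text in EBCDIC and numbers as COBOL COMP-3 packed decimal, neither of which cloud platforms read natively. The sketch below shows, under simplified assumptions, what a conversion step must do; a real pipeline also needs the COBOL copybook to know each field's offset, length, and scale.

```python
# Minimal sketch of mainframe-to-cloud data conversion.
# Assumes a standalone COMP-3 field; real records mix field types
# as described by a copybook.

def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode a COMP-3 packed-decimal field: two digits per byte,
    with the final nibble holding the sign (0xD = negative)."""
    digits = ""
    for byte in raw[:-1]:
        digits += f"{byte >> 4}{byte & 0x0F}"
    last = raw[-1]
    digits += str(last >> 4)
    sign = -1 if (last & 0x0F) == 0x0D else 1
    return sign * int(digits) / (10 ** scale)

# The 3-byte packed field 0x12 0x34 0x5D encodes -123.45 at scale 2:
print(unpack_comp3(bytes([0x12, 0x34, 0x5D]), scale=2))  # -123.45

# EBCDIC text fields can be converted with Python's built-in codec:
print(b"\xC9\xC2\xD4".decode("cp037"))  # IBM
```

None of this conversion logic exists on cloud-native platforms out of the box, which is why mainframe-aware integration tooling matters.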
As businesses make the transition to cloud, the need for trusted data to make good business decisions is more vital than ever.
Key business initiatives in 2022 revolve around transforming customer experiences, applying AI to proven business cases to derive new insights and increase efficiency, leveraging the power of location to solve new problems (optional light call out to covid-19 problems being solved), and ensuring that the business is secure and compliant.
Each of these initiatives is heavily dependent upon integrated, clean, accurate, contextualized, enriched data in order to deliver the maximum benefit to the organization. (Transition to next slide showing stats on data challenges.)
Unfortunately, those within the organization closest to the data can’t trust it for various reasons. For instance, they…
“Can’t get it fast enough”
Trapped in complex, legacy systems
Not available when & where it’s needed
Not as fresh as business demands
“Don’t understand it”
Don’t understand the lineage of data
Don’t know how it’s used by the business
Don’t have accountability around data changes
“Can’t trust it”
Full of errors
Non-standardized
“Don’t have the context to use it”
Lacking the 3rd party data and location context needed for decision-making
“Don’t know when it’s going to break”
Downtime comes as a surprise
Anomalies unexpectedly impact the business downstream
The true root cause of data problems is unidentified
Fortunately, you are not alone. Every day, we talk to companies like yours who are struggling to meet their data-driven business goals - because their data is a mess.
Source for stats on left side
IDC Spotlight: Improving Data Integrity and Trust Through Transparency and Enrichment
Written by: Stewart Bond, Research Director, Data Integration and Intelligence Software
To achieve your goals, you need what we call data integrity.
Data integrity is data with maximum accuracy, consistency, completeness, and context for confident business decision-making.
Mega-trends, such as the rise of AI-powered cloud technologies and the expanding ranks of executives and managers demanding full and unhindered ‘democratized’ access to data, always force big changes to business systems, at many different levels. Yet if your IT architecture includes mainframe systems, you will always have to stand with one foot in the present and the other in the future.
Choosing solutions that are designed from the start to enable you to easily deal with new realities will ensure you can keep your balance, even as the future becomes ever more fluid and slippery.
The modular, interoperable Precisely Data Integrity Suite contains everything you need to deliver accurate, consistent, contextual data to your business - wherever and whenever it’s needed.
Data Integration: Break down data silos by quickly building modern data pipelines that drive innovation
Data Observability: Proactively uncover data anomalies and act before they become costly downstream issues
Data Governance: Manage data policy and processes with greater insight into your data’s meaning, lineage, and impact
Data Quality: Deliver data that’s accurate, consistent, and fit for purpose across operational and analytical systems
Geo Addressing: Verify, standardize, cleanse, and geocode addresses to unlock valuable context for more informed decision making
Spatial Analytics: Derive and visualize spatial relationships hidden in your data to reveal critical context for better decisions
Data Enrichment: Enrich your business data with expertly curated datasets containing thousands of attributes for faster, confident decisions
There are a handful of principles that we have held to throughout the development of the Suite, and it’s these principles that differentiate it from other solutions in the market.
Modular - Unlike many products in the market, the Data Integrity Suite can be consumed in a modular fashion. Best-of-breed capabilities that deliver great value alone and incremental functionality and value by working together.
Interoperable - It works seamlessly with other modules of the Suite, other products in the Precisely portfolio, and existing technology ecosystems. It is designed to work with traditional and modern technology stacks to maximize return on investment and drive innovation.
Easy - The Data Integrity Suite is designed for business and data teams. The Suite’s modules share a common, business-friendly user interface and no-code experience. The Data Integrity Suite can democratize your data and enable users to serve themselves.
Intelligent - Machine learning intelligence automates and streamlines data integrity processes, leveraged to understand data, identify data anomalies, suggest data quality rules, and more.
Dynamic - And organizations that need more agility in their decision-making and operational processes can access data at the speed of their business. Replication delivers data wherever needed in near-real time; dynamic datasets update as new information becomes available, and potential data issues are proactively flagged before they become costly problems.
The first module is Data Integration. Typically, in any major data initiative, you first need to connect to sources, and sometimes move or replicate data to another environment.
With Data Integration, you can easily create streaming data pipelines that integrate data from core environments such as relational, and of course, mainframe and IBM I, with modern cloud-based data platforms like Snowflake to drive analytics and innovation and extend the value of your mission-critical systems.
We understand that pipelines must scale for your needs today and extend for tomorrow.
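The build-once, deploy-anywhere idea can be sketched generically: a pipeline defined as composable steps (source, transforms, sink) runs unchanged wherever it is deployed. The names below are illustrative, not Precisely APIs.

```python
# Hypothetical sketch of a build-once pipeline: the same definition
# streams records from any source through transforms into any sink.
from typing import Callable, Iterable

Record = dict

def pipeline(source: Iterable[Record],
             transforms: list[Callable[[Record], Record]],
             sink: Callable[[Record], None]) -> int:
    """Stream records from source through transforms into sink;
    return the number of records processed."""
    count = 0
    for record in source:
        for t in transforms:
            record = t(record)
        sink(record)
        count += 1
    return count

# Example transform: normalize mainframe-style field names for a
# cloud warehouse.
def snake_case_keys(rec: Record) -> Record:
    return {k.lower().replace("-", "_"): v for k, v in rec.items()}

loaded: list[Record] = []
n = pipeline(
    source=[{"CUST-ID": 1, "ORDER-TOTAL": 99.5}],
    transforms=[snake_case_keys],
    sink=loaded.append,
)
print(n, loaded)  # 1 [{'cust_id': 1, 'order_total': 99.5}]
```

In practice, the source would be a change-data-capture feed from the mainframe and the sink a cloud platform such as Snowflake; the pipeline definition itself stays the same.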