Transform Your Mainframe Data for the Cloud with Precisely and Apache Kafka
Agenda
• Beginning your cloud transformation journey
• Connecting the mainframe to the cloud
• Ingredients for mainframe-to-cloud success
• Stark Denmark, a story of transformation with mainframe, Apache Kafka and the cloud
"Companies waste an average of 35% of their cloud budget on inefficiencies." — PwC, 2021
Beginning your cloud transformation journey
• Focus on application transformation first
• Consider containerization for hybrid and multi-cloud deployments
• Understand on-prem data pipelines
• Determine which on-prem solutions need to open to the cloud
Connecting the mainframe to the cloud
• Bring rich transaction data to the cloud
• Improve cloud analytics and insights
• Speed delivery of information
• Scale with next-generation initiatives
Ingredients for mainframe-to-cloud success
Ingredient 1: Log-based data capture
Did you know?
• Connect CDC can leverage published log or journal standards to identify and capture each change before copying it to the share queue.
• The Connect CDC queue maintains data integrity and prevents data loss if a connection drops during file transmission.
[Diagram: Source DBMS → Change selector (log/journal) → Queue → Retrieve/Transform/Send → Apply → Target DBMS]
1. Use of transaction logs or triggers eliminates the need for invasive actions on the DBMS
2. Selective extracts from the logs and a defined queue space ensure data integrity
3. Transformation can in many cases be done off-box to reduce the impact on production
4. The apply process returns an acknowledgement to the queue to complete a pseudo two-phase commit
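The queue-and-acknowledge flow in steps 1–4 can be sketched in a few lines of Python. This is a minimal stdlib illustration of acknowledgement-based delivery; the class and method names are invented for this sketch and are not Connect CDC's actual interface.

```python
import queue

class ChangeQueue:
    """Illustrative CDC staging queue with acknowledgement-based delete,
    approximating the pseudo two-phase commit described above.
    All names here are invented for this sketch."""

    def __init__(self):
        self._q = queue.Queue()  # captured changes awaiting delivery
        self._in_flight = {}     # retrieved but not yet acknowledged
        self._next_id = 0

    def capture(self, change):
        # Steps 1-2: the change selector writes a log-derived change
        self._q.put((self._next_id, change))
        self._next_id += 1

    def retrieve(self):
        # Step 3: hand the change to retrieve/transform/send, but keep
        # a copy until the target acknowledges the apply
        change_id, change = self._q.get()
        self._in_flight[change_id] = change
        return change_id, change

    def ack(self, change_id):
        # Step 4: apply succeeded on the target; safe to discard
        del self._in_flight[change_id]

    def nack(self, change_id):
        # Connection dropped mid-apply: requeue, so nothing is lost
        self._q.put((change_id, self._in_flight.pop(change_id)))

cq = ChangeQueue()
cq.capture({"op": "UPDATE", "key": 42, "balance": 99.50})
cid, change = cq.retrieve()
cq.nack(cid)                 # simulate a dropped connection: requeued
cid, change = cq.retrieve()  # the same change comes back
cq.ack(cid)                  # second attempt applied and acknowledged
```

The key design point is that the queue only discards a change after the apply side acknowledges it, which is what closes the pseudo two-phase commit.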
Ingredient 2: Real-time data pipelines
• Stream real-time application data from relational databases, such as Db2 for IBM i, to mission-critical business applications and analytics platforms
• Other source examples: mainframes, EDWs
• Business application examples: fraud detection, hotel reservations, mobile banking
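As a rough sketch of what such a pipeline carries, a row change can be wrapped as a self-describing JSON event before it is published to a topic. The field names and table name below are assumptions for illustration, not a Precisely Connect wire format.

```python
import json
from datetime import datetime, timezone

def row_to_event(table, op, row):
    """Illustrative helper: wrap one relational row change as a JSON
    event ready for a streaming topic. Field names are assumptions."""
    return json.dumps({
        "table": table,
        "op": op,  # INSERT / UPDATE / DELETE
        "ts": datetime.now(timezone.utc).isoformat(),
        "after": row,
    })

# e.g. a card transaction row from Db2 for IBM i, bound for a
# fraud-detection consumer (hypothetical table and columns)
event = row_to_event("BANKING.CARD_TXN", "INSERT",
                     {"acct": "0042", "amount": 125.00, "merchant": "POS"})
```

A downstream consumer can then route on `table` and `op` without knowing anything about the source system.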
Ingredient 3: Flexible replication options
• One-way
• Two-way
• Cascade
• Bi-directional
• Distribute
• Consolidate
Ingredient 4: Precisely Connect
Connect CDC is cloud-platform enabled for ingest and stream.
Sources
• Mainframe: Db2 for z/OS, IMS, VSAM
• Other: Db2 (IBM i, z, LUW), Oracle, Sybase, Informix, PostgreSQL, MySQL, MS SQL Server, flat files (delimited)
Targets
• RDBMS, EDWs, data streams
• Strategic projects: real-time analytics, AI and machine learning
Scale to meet the needs of your business
Self-service data integration through a browser-based interface
• Design, deploy, and monitor real-time change replication from a variety of traditional systems (mainframe, IMS, RDBMS, EDW) to next-generation distributed streaming platforms like Apache Kafka
• Enable the construction of real-time data pipelines
Resilient data delivery
• Fault tolerant – resilient to network, source, target, and application server outages
• Protects against data loss if a connection is temporarily lost
• Keeps track of exactly where a transfer left off and automatically restarts at that point – no manual intervention
• No missing or duplicate data
Maintain metadata integrity
• Integrates with the Kafka schema registry
• On-demand, metadata-driven pulls of data from a variety of database systems to next-generation data stores in the cloud and cluster
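The "restart at the exact point" behavior above boils down to checkpointing the last successfully applied position. The sketch below simulates an outage and a restart using only the standard library; class and file names are invented for illustration, not Connect's internals.

```python
import json
import os
import tempfile

class CheckpointedReader:
    """Sketch of resilient delivery: persist the last-applied offset so
    a restart resumes at exactly that point. Names are illustrative."""

    def __init__(self, records, state_path):
        self.records = records
        self.state_path = state_path

    def _offset(self):
        if os.path.exists(self.state_path):
            with open(self.state_path) as f:
                return json.load(f)["offset"]
        return 0

    def deliver(self, apply_fn):
        for i in range(self._offset(), len(self.records)):
            apply_fn(self.records[i])
            # checkpoint only after a successful apply:
            # no missing records and no duplicates
            with open(self.state_path, "w") as f:
                json.dump({"offset": i + 1}, f)

applied = []
state = os.path.join(tempfile.mkdtemp(), "checkpoint.json")
reader = CheckpointedReader(["a", "b", "c", "d"], state)

def crashy_apply(rec):
    if rec == "c":
        raise ConnectionError("simulated network outage")
    applied.append(rec)

try:
    reader.deliver(crashy_apply)  # applies "a", "b", then the link drops
except ConnectionError:
    pass

# a fresh reader (the "restart") resumes from the checkpoint
CheckpointedReader(["a", "b", "c", "d"], state).deliver(applied.append)
```

Because the checkpoint is written only after a record is applied, the restarted reader picks up with "c" and "d" and nothing is lost or repeated.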
Customer Story
• Connect leverages transactional CICS events (committed changes) within the mainframe, replicating the changes to Latitude's Confluent-based Kafka event bus
• Connect performs one-way, resilient replication to Confluent Kafka, maintaining transactional integrity and ensuring proper data delivery
• Simplified data transformation, COBOL copybook mapping, REDEFINE handling, and more, so that data is intelligible in Kafka
• Improved customer engagement and clearer insight into client behavior
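The copybook-mapping step mentioned above amounts to decoding fixed-width EBCDIC records into named, typed fields. The layout below (CUST-ID PIC 9(6), CUST-NAME PIC X(10), BALANCE PIC 9(5)V99) is invented for illustration and is not Latitude's real copybook.

```python
# Minimal sketch of COBOL copybook mapping: decode a fixed-width EBCDIC
# record into a dict so it is intelligible as JSON in Kafka.

LAYOUT = [  # (field, offset, length, decoder) -- hypothetical layout
    ("cust_id", 0, 6, int),
    ("cust_name", 6, 10, str.strip),
    ("balance", 16, 7, lambda s: int(s) / 100),  # implied decimal, V99
]

def decode_record(raw: bytes) -> dict:
    text = raw.decode("cp037")  # cp037 = common EBCDIC code page
    return {name: dec(text[off:off + ln]) for name, off, ln, dec in LAYOUT}

record = "000042JONES     0012345".encode("cp037")
decode_record(record)
# -> {'cust_id': 42, 'cust_name': 'JONES', 'balance': 123.45}
```

REDEFINE handling is the harder part in practice: the same bytes must be decoded under different layouts depending on a discriminator field, which this sketch does not attempt.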
About
Australian financial services company headquartered in Melbourne, Victoria. Its core business is consumer finance through a variety of services, including unsecured personal loans, credit cards, car loans, personal insurance, and interest-free retail finance. It is the biggest non-bank lender of consumer credit in Australia.
Problem
The company sought to modernize its transaction monitoring capabilities to improve client engagement. This required gaining real-time insight into client behavior so that it could provide alerting, notifications, offers, and reminders as events happened. Unlocking this data from mainframe VSAM was a critical step to achieving success.
Solution
Precisely Connect
Confluent Kafka
Questions