Keine Angst vorm Dinosaurier ("Don't Fear the Dinosaur"):
Mainframe Integration and Offloading with Confluent & Apache Kafka
Kai Waehner | Technology Evangelist, Confluent
Marco Kopp | Senior Sales Engineer, Precisely
Kai Waehner
Technology Evangelist
contact@kai-waehner.de
LinkedIn
@KaiWaehner
www.confluent.io
www.kai-waehner.de
Mainframe Integration, Offloading
and Replacement with Apache
Kafka
Stand up to the Dinosaur!
The Mainframe is here to stay!
“Mainframes are still hard at work,
processing over 70 percent of the world’s
most important computing transactions
every day. Organizations like banks, credit
card companies, airlines, medical facilities,
insurance companies, and others that can
absolutely not afford downtime and errors
depend on the mainframe to get the job
done. Nearly three-quarters of all Fortune
500 companies still turn to the mainframe to
get the critical processing work completed”
https://www.bmc.com/blogs/mainframe-mips-an-introduction/
What is a Mainframe?
Modern mainframe design is characterized less by raw
computational speed and more by:
• High reliability and security
• Extensive input-output ("I/O") facilities with the ability to
offload to separate engines
• Strict backward compatibility with older software
• High hardware and computational utilization rates through
virtualization to support massive throughput
• Hot-swapping of hardware, such as processors and memory
Vendors: “IBM and the Seven Dwarfs”
The IBM z15, announced in 2019,
with up to 40 TB RAM and 190 cores,
typically costs millions of dollars
(variable software costs not included)
Neobanks and FinTechs Hunting the Traditional Banks
• Monolithic
• Proprietary
• Complex
• Inflexible
MIPS (million instructions per second):
used to normalize CPU usage across CPU types, models, and hardware configurations
MSU (million service units):
hardware and software metrics calculated directly by the operating system
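Because mainframe software is typically billed on peak MIPS/MSU consumption, every read that is served somewhere else directly reduces the bill. A minimal sketch of that arithmetic, with entirely hypothetical rates (real MSU pricing varies by contract):

```python
# Rough illustration of why offloading reads from the mainframe saves money.
# All rates are hypothetical; real MSU pricing varies by vendor and contract.

MSU_COST_PER_MONTH = 1000.0   # assumed $ per MSU of monthly usage
MSU_PER_MILLION_READS = 2.0   # assumed MSU consumed per million reads

def monthly_read_cost(reads_millions: float, offload_ratio: float) -> float:
    """Cost of the reads still hitting the mainframe after offloading.

    offload_ratio: fraction of reads served from the event streaming
    platform instead of the mainframe (0.0 = none, 1.0 = all).
    """
    mainframe_reads = reads_millions * (1.0 - offload_ratio)
    return mainframe_reads * MSU_PER_MILLION_READS * MSU_COST_PER_MONTH

before = monthly_read_cost(500, 0.0)   # all reads on the mainframe
after = monthly_read_cost(500, 0.9)    # 90% of reads offloaded to Kafka
```

With these assumed rates, offloading 90% of reads cuts the read-driven cost by 90%; the point is the proportionality, not the absolute numbers.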
… and what about hiring mainframe experts?
Huge demand to build an open, flexible, scalable platform
• Real time
• Scalability
• High availability
• Decoupling
• Cost reduction
• Flexibility
• Elasticity
• Standards-based
• Extensibility
• Security
• Infrastructure-independent
• Multi-region / global
Data in Motion with Event Streaming
Stream processing over time: filter, analyze in-flight, and create and store materialized views.
Apache Kafka is an Event Streaming Platform
[Diagram: producers (MES, ERP, sensors, mobile) write to streams and storage of real-time events; consumers (customer 360, real-time alerting system, data warehouse) read from them. Connectors and stream processing apps handle events such as supplier, alert, forecast, inventory, customer, and order.]
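The core abstraction behind the platform diagram above is an append-only log: producers append events, and each consumer reads at its own pace with its own offset, which is what decouples the systems. A minimal in-memory sketch of that idea (this is an illustration only, not the real Kafka client API):

```python
# Minimal in-memory sketch of the Kafka log abstraction: producers append
# events to a topic; each consumer tracks its own offset and reads
# independently. Illustrates the decoupling only -- not the real Kafka API.

class Topic:
    def __init__(self):
        self.log = []  # append-only record log

    def produce(self, event):
        self.log.append(event)

class Consumer:
    def __init__(self, topic):
        self.topic = topic
        self.offset = 0  # each consumer keeps its own position

    def poll(self):
        events = self.topic.log[self.offset:]
        self.offset = len(self.topic.log)
        return events

orders = Topic()
orders.produce({"order_id": 1, "amount": 42.0})

alerting = Consumer(orders)    # e.g. a real-time alerting system
warehouse = Consumer(orders)   # e.g. data warehouse ingestion

first = alerting.poll()        # alerting sees the first event
orders.produce({"order_id": 2, "amount": 7.5})
second = alerting.poll()       # alerting only sees the new event
all_for_dw = warehouse.poll()  # the warehouse reads both, at its own pace
```

Because the log is durable and offsets are per-consumer, adding a new consumer never affects existing ones; that is the decoupling the slide refers to.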
Apache Kafka at Scale at Tech Giants
> 7 trillion messages / day > 6 Petabytes / day
“You name it”
* Kafka is not just used by tech giants
** Kafka is not just used for big data
Business Value per Use Case
Key drivers: increase revenue $↑ (make money), decrease costs $↓ (save money), mitigate risk $↔ (protect money).
Strategic objectives (sample): improve customer experience (CX), increase operational efficiency, migrate to cloud, build a core business platform.
Example use cases:
• Fraud detection: Detect Fraud & Prevent Fraud in Real Time: PayPal; Online Fraud Detection
• IoT sensor ingestion: Connected Car, Navigation & improved in-car experience: Audi; Predictive Maintenance: Audi
• Digital replatforming / mainframe offload: Mainframe Offload: RBC
• Customer 360: Simplifying Omni-channel Retail at Scale: Target
• Faster transactional processing / analysis incl. Machine Learning / AI
• Microservices architecture: Developer Velocity, Building Stateful Financial Applications with Kafka Streams: Funding Circle
• Online security (syslog, log aggregation, Splunk replacement); Kafka as a Service, A Tale of Security and Multi-Tenancy: Apple
• Middleware replacement
• Regulatory / digital transformation: Streaming Platform in a regulated environment (e.g. Electronic Medical Records): Celmatix
• Real-time app updates: Real Time Streaming Platform for Communications and Beyond: Capital One
• Website / core operations (central nervous system): the [Silicon Valley] digital natives LinkedIn, Netflix, Uber, Yelp...
• Application modernization: multiple examples
https://www.confluent.io/customers/rbc/ “… rescue data off of the mainframe, in a cloud native,
microservice-based fashion … [to] … significantly reduce the reads
on the mainframe, saving RBC fixed infrastructure costs (OPEX).
RBC stayed compliant with bank regulations and business logic,
and is now able to create new applications using the same event-
based architecture.”
Domain-Driven Design and Decoupled Applications
[Diagram: an Event Streaming Platform (Kafka cluster with Schema Registry, audit logs, RBAC, etc.) decouples a CRM domain, a legacy payment domain, and a fraud domain. Integration runs via Kafka Connect (e.g. a CRM integration connector, a mainframe connector for legacy integration) and custom applications built with Java / C++ / Go / Python / ksqlDB.]
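In practice, the connector boxes in such an architecture are Kafka Connect configurations. As a hedged illustration only, a generic JDBC sink that drains a CDC topic into a modern database could be configured as sketched below; the connector name, topic, and connection details are hypothetical, and real mainframe CDC would use the respective vendor's connector (e.g. Precisely Connect):

```json
{
  "name": "payments-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "mainframe.payments.cdc",
    "connection.url": "jdbc:postgresql://modern-db:5432/payments",
    "connection.user": "connect",
    "connection.password": "********",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "auto.create": "true"
  }
}
```

Such a config is typically submitted to the Kafka Connect REST API; `upsert` mode with the record key as primary key keeps the target table in sync with the changelog topic.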
Hybrid and Global Architectures
• Aggregate small-footprint edge deployments with replication (aggregation)
• Simplify disaster recovery operations with multi-region clusters with RPO=0 and RTO=0
• Stream data globally with replication and Cluster Linking
Journey from Mainframe to Hybrid* and Cloud
• Phase 1: Mainframe Offloading
• Phase 2: Hybrid Replication
• Phase 3: Mainframe Replacement
* with or without the mainframe
Mainframe Offloading
[Diagram: the legacy app's complex business logic pushes each change once into the streams of real-time events, alongside database changes, microservices events, SaaS data, and customer experiences. Modern apps 1 and 2 write and read continuously against the event stream. Writes and reads against the mainframe itself consume MIPS / MSU; reads served from the stream consume no MIPS / MSU.]
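The offloading pattern can be sketched in a few lines: the mainframe pushes each change once, a consumer maintains a materialized view, and all subsequent reads hit the view. The counters (hypothetical, for illustration) show where the MIPS/MSU cost lands:

```python
# Sketch of the offloading pattern: the mainframe pushes each change once,
# a consumer maintains a materialized view, and all reads hit the view.
# Counters illustrate where the MIPS/MSU cost would land.

class OffloadedView:
    def __init__(self):
        self.view = {}            # materialized view built from the stream
        self.mainframe_reads = 0  # would consume MIPS/MSU
        self.kafka_reads = 0      # no MIPS/MSU cost

    def on_change_event(self, key, value):
        """Called once per change pushed from the mainframe."""
        self.view[key] = value

    def read(self, key):
        self.kafka_reads += 1     # served from the view, not the mainframe
        return self.view.get(key)

accounts = OffloadedView()
accounts.on_change_event("acct-1", {"balance": 100})
accounts.on_change_event("acct-1", {"balance": 80})

# Modern apps read continuously without touching the mainframe:
for _ in range(1000):
    balance = accounts.read("acct-1")["balance"]
```

However many consumers read, the mainframe paid for each change exactly once; the read load scales on the streaming side instead.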
Mainframe Replacement
[Diagram: the same architecture, with the mainframe's complex business logic retired; all writes and reads run continuously against the event stream, so no MIPS / MSU are consumed.]
Why Confluent
@KaiWaehner - www.kai-waehner.de
Event Streaming Maturity Model (value grows with investment & time)
1. Initial awareness / pilot (1 Kafka cluster)
2. Start to build pipeline / deliver 1 new outcome (1 Kafka cluster)
3. Mission-critical deployment (stretched, hybrid, multi-region)
4. Build contextual event-driven apps (stretched, hybrid, multi-region)
5. Central nervous system (global Kafka)
Underpinned across all stages by product, support, training, partners, technical account management...
The Rise of Event Streaming
2010: Apache Kafka created at LinkedIn by Confluent's founders
2014: Confluent founded
2020: 80% of Fortune 100 companies trust and use Apache Kafka
Confluent Platform
Freedom of choice: self-managed software or fully managed cloud service, with committer-driven expertise (Apache Kafka: open source | community licensed) and a complete engagement model (enterprise support, professional services, training, partners).
• Operator: efficient operations at scale
  - Dynamic performance & elasticity: Self-Balancing Clusters | Tiered Storage
  - Flexible DevOps automation: Operator | Ansible
  - GUI-driven management & monitoring: Control Center | Proactive Support
• Developer: unrestricted developer productivity
  - Event streaming database: ksqlDB
  - Rich pre-built ecosystem: Connectors | Hub | Schema Registry
  - Multi-language development: Non-Java Clients | REST Proxy | Admin REST APIs
• Architect: production-stage prerequisites
  - Global resilience: Multi-Region Clusters | Replicator | Cluster Linking
  - Data compatibility: Schema Registry | Schema Validation
  - Enterprise-grade security: RBAC | Secrets | Audit Logs
• Executive: partnership for business success
  - TCO / ROI and revenue / cost / risk impact
Kai Waehner
Technology Evangelist
contact@kai-waehner.de
@KaiWaehner
www.kai-waehner.de
www.confluent.io
LinkedIn
Questions? Feedback?
Let’s connect!
Precisely Connect Overview
Connecting today's infrastructure with tomorrow's technology to unlock the potential of all your enterprise data.
• Extract, Transform, Load: high-performance ETL for Apache Spark, cloud, Windows, Linux, Unix, and Hadoop MapReduce
• Data Replication via CDC: real-time database replication to streaming platforms, cloud, databases, and data warehouses
Connect
Connect is the best solution for accessing and integrating mainframe and IBM i data with cloud frameworks in real time.
• Quickly and efficiently integrate ALL enterprise data, including mainframe and IBM i
• Design-once, deploy-anywhere approach to sources and targets in real time
• Reduce costs and development time from weeks to days
• Secure and governed, with unrivaled scalability and performance
Connect Data Replication Capabilities
Top use cases:
• DB-to-DB replication: reporting, BI/ETL integration, and much more
• Migration: across geographies, OS vendors, DB vendors, and product versions
• Modernization: to streaming platforms (Kafka, Kinesis), cloud data warehouses, databases, and application vendors
CDC that connects the enterprise
A single tool supports data replication and big data convergence:
• Replicate data in real time to feed applications or analytics
• Real-time replication for hierarchical formats: Db2/z, IMS, and VSAM
• Power business reporting and insights
• Move only changed data
Connect legacy systems to the cloud:
• Build streaming data pipelines from traditional systems to real-time applications
• Support hybrid environments
• Keep data lakes fresh
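"Move only changed data" means the CDC stream carries insert/update/delete events that a consumer applies to keep a replica in sync. A minimal sketch of that apply loop; the event shape (`op`/`key`/`row`) is hypothetical, since real CDC tools such as Connect CDC define their own formats:

```python
# Minimal sketch of applying a CDC stream to keep a replica in sync.
# The event shape (op/key/row) is hypothetical; real CDC products define
# their own change-event formats.

def apply_cdc(replica: dict, events: list) -> dict:
    for ev in events:
        if ev["op"] in ("insert", "update"):
            replica[ev["key"]] = ev["row"]   # upsert the changed row
        elif ev["op"] == "delete":
            replica.pop(ev["key"], None)     # remove the deleted row
    return replica

events = [
    {"op": "insert", "key": 1, "row": {"name": "alice", "balance": 10}},
    {"op": "update", "key": 1, "row": {"name": "alice", "balance": 25}},
    {"op": "insert", "key": 2, "row": {"name": "bob", "balance": 5}},
    {"op": "delete", "key": 2, "row": None},
]
replica = apply_cdc({}, events)
```

Only the four change events cross the wire, yet the replica ends up identical to replaying the full table, which is why CDC is so much cheaper than bulk re-extracts.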
Business Benefits with Connect CDC
• Build streaming data pipelines: power business decision-making with real-time data
• Get a consistent view: keep your business in sync and consistent across the enterprise
• Migrate data: zero downtime for database/application upgrades and system re-platforming
• Keep data lakes fresh: ensure real-time updates on transactional systems, including IBM i, are captured and replicated
• Enable timely reporting: confidently satisfy even the most demanding SLAs
• Resilient data delivery: support data governance and security requirements
Flexible Replication Options
Choose a topology, or combine them, to meet your data-sharing needs:
• One-way
• Two-way / bi-directional
• Cascade
• Distribute
• Consolidate
Connect CDC
[Diagram: Precisely Connect sources, namely RDBMS (Oracle, Informix, SQL Server, Db2 LUW, Sybase), mainframe (Db2 for z/OS, IMS, VSAM), and IBM i (Db2 for IBM i), are ingested and streamed as data streams to cloud-platform-enabled targets, powering strategic projects such as real-time analytics, AI, and machine learning.]
Product Sneak-Peek
• Precisely Connect Portal [screenshots]
• Confluent Control Center [screenshot]
• Kafdrop [screenshot]
Questions?
Your Contact:
Bernd Stiene
Senior Account Executive
Phone: +49 152 5398 4389
Email: bernd.stiene@precisely.com
Product Sheet / eBook
Thank you
AI You Can Trust - Ensuring Success with Data Integrity WebinarAI You Can Trust - Ensuring Success with Data Integrity Webinar
AI You Can Trust - Ensuring Success with Data Integrity WebinarPrecisely
 
Optimisez la fonction financière en automatisant vos processus SAP
Optimisez la fonction financière en automatisant vos processus SAPOptimisez la fonction financière en automatisant vos processus SAP
Optimisez la fonction financière en automatisant vos processus SAPPrecisely
 
SAPS/4HANA Migration - Transformation-Management + nachhaltige Investitionen
SAPS/4HANA Migration - Transformation-Management + nachhaltige InvestitionenSAPS/4HANA Migration - Transformation-Management + nachhaltige Investitionen
SAPS/4HANA Migration - Transformation-Management + nachhaltige InvestitionenPrecisely
 
Automatisierte SAP Prozesse mit Hilfe von APIs
Automatisierte SAP Prozesse mit Hilfe von APIsAutomatisierte SAP Prozesse mit Hilfe von APIs
Automatisierte SAP Prozesse mit Hilfe von APIsPrecisely
 
Moving IBM i Applications to the Cloud with AWS and Precisely
Moving IBM i Applications to the Cloud with AWS and PreciselyMoving IBM i Applications to the Cloud with AWS and Precisely
Moving IBM i Applications to the Cloud with AWS and PreciselyPrecisely
 
Effective Security Monitoring for IBM i: What You Need to Know
Effective Security Monitoring for IBM i: What You Need to KnowEffective Security Monitoring for IBM i: What You Need to Know
Effective Security Monitoring for IBM i: What You Need to KnowPrecisely
 
Automate Your Master Data Processes for Shared Service Center Excellence
Automate Your Master Data Processes for Shared Service Center ExcellenceAutomate Your Master Data Processes for Shared Service Center Excellence
Automate Your Master Data Processes for Shared Service Center ExcellencePrecisely
 
5 Keys to Improved IT Operation Management
5 Keys to Improved IT Operation Management5 Keys to Improved IT Operation Management
5 Keys to Improved IT Operation ManagementPrecisely
 
Unlock Efficiency With Your Address Data Today For a Smarter Tomorrow
Unlock Efficiency With Your Address Data Today For a Smarter TomorrowUnlock Efficiency With Your Address Data Today For a Smarter Tomorrow
Unlock Efficiency With Your Address Data Today For a Smarter TomorrowPrecisely
 
Navigating Cloud Trends in 2024 Webinar Deck
Navigating Cloud Trends in 2024 Webinar DeckNavigating Cloud Trends in 2024 Webinar Deck
Navigating Cloud Trends in 2024 Webinar DeckPrecisely
 
Mainframe Sort Operations: Gaining the Insights You Need for Peak Performance
Mainframe Sort Operations: Gaining the Insights You Need for Peak PerformanceMainframe Sort Operations: Gaining the Insights You Need for Peak Performance
Mainframe Sort Operations: Gaining the Insights You Need for Peak PerformancePrecisely
 

More from Precisely (20)

Zukuntssichere SAP Prozesse dank automatisierter Massendaten
Zukuntssichere SAP Prozesse dank automatisierter MassendatenZukuntssichere SAP Prozesse dank automatisierter Massendaten
Zukuntssichere SAP Prozesse dank automatisierter Massendaten
 
Unlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsUnlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power Systems
 
Crucial Considerations for AI-ready Data.pdf
Crucial Considerations for AI-ready Data.pdfCrucial Considerations for AI-ready Data.pdf
Crucial Considerations for AI-ready Data.pdf
 
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdfHyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
 
Justifying Capacity Managment Webinar 4/10
Justifying Capacity Managment Webinar 4/10Justifying Capacity Managment Webinar 4/10
Justifying Capacity Managment Webinar 4/10
 
Automate Studio Training: Materials Maintenance Tips for Efficiency and Ease ...
Automate Studio Training: Materials Maintenance Tips for Efficiency and Ease ...Automate Studio Training: Materials Maintenance Tips for Efficiency and Ease ...
Automate Studio Training: Materials Maintenance Tips for Efficiency and Ease ...
 
Leveraging Mainframe Data in Near Real Time to Unleash Innovation With Cloud:...
Leveraging Mainframe Data in Near Real Time to Unleash Innovation With Cloud:...Leveraging Mainframe Data in Near Real Time to Unleash Innovation With Cloud:...
Leveraging Mainframe Data in Near Real Time to Unleash Innovation With Cloud:...
 
Testjrjnejrvnorno4rno3nrfnfjnrfnournfou3nfou3f
Testjrjnejrvnorno4rno3nrfnfjnrfnournfou3nfou3fTestjrjnejrvnorno4rno3nrfnfjnrfnournfou3nfou3f
Testjrjnejrvnorno4rno3nrfnfjnrfnournfou3nfou3f
 
Data Innovation Summit: Data Integrity Trends
Data Innovation Summit: Data Integrity TrendsData Innovation Summit: Data Integrity Trends
Data Innovation Summit: Data Integrity Trends
 
AI You Can Trust - Ensuring Success with Data Integrity Webinar
AI You Can Trust - Ensuring Success with Data Integrity WebinarAI You Can Trust - Ensuring Success with Data Integrity Webinar
AI You Can Trust - Ensuring Success with Data Integrity Webinar
 
Optimisez la fonction financière en automatisant vos processus SAP
Optimisez la fonction financière en automatisant vos processus SAPOptimisez la fonction financière en automatisant vos processus SAP
Optimisez la fonction financière en automatisant vos processus SAP
 
SAPS/4HANA Migration - Transformation-Management + nachhaltige Investitionen
SAPS/4HANA Migration - Transformation-Management + nachhaltige InvestitionenSAPS/4HANA Migration - Transformation-Management + nachhaltige Investitionen
SAPS/4HANA Migration - Transformation-Management + nachhaltige Investitionen
 
Automatisierte SAP Prozesse mit Hilfe von APIs
Automatisierte SAP Prozesse mit Hilfe von APIsAutomatisierte SAP Prozesse mit Hilfe von APIs
Automatisierte SAP Prozesse mit Hilfe von APIs
 
Moving IBM i Applications to the Cloud with AWS and Precisely
Moving IBM i Applications to the Cloud with AWS and PreciselyMoving IBM i Applications to the Cloud with AWS and Precisely
Moving IBM i Applications to the Cloud with AWS and Precisely
 
Effective Security Monitoring for IBM i: What You Need to Know
Effective Security Monitoring for IBM i: What You Need to KnowEffective Security Monitoring for IBM i: What You Need to Know
Effective Security Monitoring for IBM i: What You Need to Know
 
Automate Your Master Data Processes for Shared Service Center Excellence
Automate Your Master Data Processes for Shared Service Center ExcellenceAutomate Your Master Data Processes for Shared Service Center Excellence
Automate Your Master Data Processes for Shared Service Center Excellence
 
5 Keys to Improved IT Operation Management
5 Keys to Improved IT Operation Management5 Keys to Improved IT Operation Management
5 Keys to Improved IT Operation Management
 
Unlock Efficiency With Your Address Data Today For a Smarter Tomorrow
Unlock Efficiency With Your Address Data Today For a Smarter TomorrowUnlock Efficiency With Your Address Data Today For a Smarter Tomorrow
Unlock Efficiency With Your Address Data Today For a Smarter Tomorrow
 
Navigating Cloud Trends in 2024 Webinar Deck
Navigating Cloud Trends in 2024 Webinar DeckNavigating Cloud Trends in 2024 Webinar Deck
Navigating Cloud Trends in 2024 Webinar Deck
 
Mainframe Sort Operations: Gaining the Insights You Need for Peak Performance
Mainframe Sort Operations: Gaining the Insights You Need for Peak PerformanceMainframe Sort Operations: Gaining the Insights You Need for Peak Performance
Mainframe Sort Operations: Gaining the Insights You Need for Peak Performance
 

Recently uploaded

#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions
 
Azure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAzure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAndikSusilo4
 
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your BudgetHyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your BudgetEnjoy Anytime
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphNeo4j
 
Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2Hyundai Motor Group
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking MenDelhi Call girls
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 

Recently uploaded (20)

#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping Elbows
 
Azure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & ApplicationAzure Monitor & Application Insight to monitor Infrastructure & Application
Azure Monitor & Application Insight to monitor Infrastructure & Application
 
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your BudgetHyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
Hyderabad Call Girls Khairatabad ✨ 7001305949 ✨ Cheap Price Your Budget
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
 
Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2Next-generation AAM aircraft unveiled by Supernal, S-A2
Next-generation AAM aircraft unveiled by Supernal, S-A2
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men08448380779 Call Girls In Friends Colony Women Seeking Men
08448380779 Call Girls In Friends Colony Women Seeking Men
 
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptxVulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 

Keine Angst vorm Dinosaurier: Mainframe-Integration und -Offloading mit Confluent & Apache Kafka

  • 1. Keine Angst vorm Dinosaurier: Mainframe-Integration und -Offloading mit Confluent & Apache Kafka Kai Waehner | Technology Evangelist bei Confluent Marco Kopp | Senior Sales Engineer bei Precisely
  • 2. Kai Waehner Technology Evangelist contact@kai-waehner.de LinkedIn @KaiWaehner www.confluent.io www.kai-waehner.de Mainframe Integration, Offloading and Replacement with Apache Kafka Stand up to the Dinosaur!
  • 3. The Mainframe is here to stay! “Mainframes are still hard at work, processing over 70 percent of the world’s most important computing transactions every day. Organizations like banks, credit card companies, airlines, medical facilities, insurance companies, and others that can absolutely not afford downtime and errors depend on the mainframe to get the job done. Nearly three-quarters of all Fortune 500 companies still turn to the mainframe to get the critical processing work completed” https://www.bmc.com/blogs/mainframe-mips-an-introduction/
  • 4. What is a Mainframe? Modern mainframe design is characterized less by raw computational speed and more by: • High reliability and security • Extensive input-output ("I/O") facilities with the ability to offload to separate engines • Strict backward compatibility with older software • High hardware and computational utilization rates through virtualization to support massive throughput • Hot-swapping of hardware, such as processors and memory Vendors: “IBM and the Seven Dwarfs” The IBM z15, announced in 2019, with up to 40TB RAM and 190 Cores, typically costs millions $$$ (variable software costs not included)
  • 5. Neobanks and FinTechs Hunting the Traditional Banks Monolithic Proprietary Complex Inflexible
  • 6. MIPS (million instructions per second): used to normalize CPU usage across CPU types, models, and hardware configs. MSU (million service units): hardware and software metrics calculated directly by the operating system. … and what about hiring mainframe experts?
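To make the MSU-based cost discussion concrete, here is a deliberately simplified, hypothetical chargeback sketch. All rates and consumption figures below are invented for illustration; real mainframe software pricing is far more involved (e.g. rolling 4-hour-average MSU peaks and tiered MLC rates):

```python
# Hypothetical mainframe software chargeback sketch.
# All numbers are invented for illustration only.

MSU_RATE_PER_MONTH = 1500.0  # assumed $/MSU/month software charge

def monthly_software_cost(peak_msu: float) -> float:
    """Software cost driven by peak MSU consumption (simplified model)."""
    return peak_msu * MSU_RATE_PER_MONTH

# Suppose read-heavy workloads account for 40% of a 500-MSU peak
# and can be offloaded to an event streaming platform.
baseline = monthly_software_cost(500)
after_offload = monthly_software_cost(500 * (1 - 0.40))

print(f"baseline:      ${baseline:,.0f}/month")
print(f"after offload: ${after_offload:,.0f}/month")
print(f"savings:       ${baseline - after_offload:,.0f}/month")
```

The point is not the specific numbers but the shape of the argument: because software charges track peak MSU, every read that no longer hits the mainframe reduces the bill.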
  • 7. Huge demand to build an open, flexible, scalable platform • Real time • Scalability • High availability • Decoupling • Cost reduction • Flexibility • Elasticity • Standards-based • Extensibility • Security • Infrastructure-independent • Multi-region / global
  • 8. STREAM PROCESSING: filter, analyze in-flight, create and store materialized views over time. Data in Motion with Event Streaming
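The "materialized view" idea on this slide can be sketched without any Kafka dependency: consume a stream of events in order, filter in-flight, and maintain an aggregated view. This is a toy, pure-Python stand-in for what ksqlDB or Kafka Streams would do; the event fields are invented for illustration:

```python
# Toy sketch of stream processing: filter events in-flight and
# maintain a materialized view (running totals per account).
# Pure-Python stand-in for a ksqlDB / Kafka Streams aggregation.

from collections import defaultdict

events = [
    {"account": "A", "type": "payment", "amount": 120},
    {"account": "B", "type": "heartbeat"},           # filtered out in-flight
    {"account": "A", "type": "payment", "amount": 80},
    {"account": "B", "type": "payment", "amount": 50},
]

view = defaultdict(int)  # materialized view: account -> total payments

for event in events:                      # "consume" the stream in order
    if event["type"] != "payment":        # filter: drop non-payment events
        continue
    view[event["account"]] += event["amount"]  # update the view

print(dict(view))  # {'A': 200, 'B': 50}
```

In a real deployment the loop body would be a continuously running stream processor and the view would be queryable state, but the filter-then-aggregate shape is the same.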
  • 9. Apache Kafka is an Event Streaming Platform: streams and storage of real-time events, stream processing apps, and connectors. Producers: MES, ERP, Sensors, Mobile. Consumers: Customer 360, Real-time Alerting System, Data warehouse. Event streams: Supplier, Alert, Forecast, Inventory, Customer, Order.
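The decoupling in this diagram rests on Kafka's log abstraction: producers only append, and each consumer group tracks its own offset, so a slow consumer never blocks a fast producer. A minimal in-memory sketch of that model (this is not the real Kafka client API, just the core abstraction):

```python
# Minimal in-memory sketch of Kafka's log + consumer-offset model.
# Not the real Kafka client API -- just the core abstraction.

class TopicLog:
    def __init__(self):
        self._records = []            # append-only log of events
        self._offsets = {}            # consumer group -> next offset to read

    def produce(self, record):
        self._records.append(record)  # producers only ever append

    def consume(self, group, max_records=10):
        start = self._offsets.get(group, 0)
        batch = self._records[start:start + max_records]
        self._offsets[group] = start + len(batch)  # "commit" the offset
        return batch

log = TopicLog()
for order_id in ("order-1", "order-2", "order-3"):
    log.produce(order_id)

# Two independent consumer groups read the same events at their own pace.
print(log.consume("billing"))    # ['order-1', 'order-2', 'order-3']
print(log.consume("analytics"))  # ['order-1', 'order-2', 'order-3']
print(log.consume("billing"))    # [] -- billing is caught up
```

Because the log retains events, a new consumer group added later can still replay the full history, which is exactly what makes integration and offloading use cases possible.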
  • 10. Apache Kafka at Scale at Tech Giants: > 7 trillion messages / day, > 6 Petabytes / day, “You name it”. * Kafka is not just used by tech giants ** Kafka is not just used for big data
  • 11. Business Value per Use Case Business Value Improve Customer Experience (CX) Increase Revenue (make money) Decrease Costs (save money) Core Business Platform Increase Operational Efficiency Migrate to Cloud Mitigate Risk (protect money) Key Drivers Strategic Objectives (sample) Fraud Detection IoT sensor ingestion Digital replatforming/ Mainframe Offload Connected Car: Navigation & improved in-car experience: Audi Customer 360 Simplifying Omni-channel Retail at Scale: Target Faster transactional processing / analysis incl. Machine Learning / AI Mainframe Offload: RBC Microservices Architecture Online Fraud Detection Online Security (syslog, log aggregation, Splunk replacement) Middleware replacement Regulatory Digital Transformation Application Modernization: Multiple Examples Website / Core Operations (Central Nervous System) The [Silicon Valley] Digital Natives; LinkedIn, Netflix, Uber, Yelp... Predictive Maintenance: Audi Streaming Platform in a regulated environment (e.g. Electronic Medical Records): Celmatix Real-time app updates Real Time Streaming Platform for Communications and Beyond: Capital One Developer Velocity - Building Stateful Financial Applications with Kafka Streams: Funding Circle Detect Fraud & Prevent Fraud in Real Time: PayPal Kafka as a Service - A Tale of Security and Multi-Tenancy: Apple Example Use Cases $↑ $↓ $↔
  • 12. https://www.confluent.io/customers/rbc/ “… rescue data off of the mainframe, in a cloud native, microservice-based fashion … [to] … significantly reduce the reads on the mainframe, saving RBC fixed infrastructure costs (OPEX). RBC stayed compliant with bank regulations and business logic, and is now able to create new applications using the same event- based architecture.”
  • 13. Kafka Connect Kafka Cluster CRM Integration Domain-Driven Design and Decoupled Applications Legacy Integration Custom Application Mainframe Connector Java / C++ / Go / Python / ksqlDB Schema Registry Event Streaming Platform CRM Domain Legacy Payment Domain Fraud Domain Audit Logs, RBAC, etc.
  • 14. Hybrid and Global Architectures Aggregate Small Footprint Edge Deployments with Replication (Aggregation) Simplify Disaster Recovery Operations with Multi-Region Clusters with RPO=0 and RTO=0 Stream Data Globally with Replication and Cluster Linking
  • 15. Mainframe Offloading: journey from mainframe to hybrid* and cloud. PHASE 1: Replication. PHASE 2: Hybrid. PHASE 3: Mainframe Replacement. * with or without the mainframe
  • 16. Mainframe Offloading Database change Microservices events SaaS data Customer experiences Streams of real time events Legacy App Modern App 1 Complex business logic Push changes once Write Write continuously Read continuously Modern App 2 Write continuously Read continuously MIPS / MSU MIPS / MSU MIPS / MSU Read No MIPS / MSU
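The economics of this offloading picture can be sketched directly: the legacy app pushes each change to the stream once, modern apps read from the stream, and the mainframe's billable operation count (the MIPS/MSU driver) stays flat no matter how many consumers appear. A toy model, with all names and counters invented for illustration:

```python
# Toy model of mainframe offloading: writes hit the mainframe once,
# all reads are served from the event stream, so adding consumers
# adds no mainframe load (the MIPS/MSU cost driver).

class Mainframe:
    def __init__(self):
        self.billable_ops = 0     # stand-in for MIPS/MSU consumption

    def apply_change(self, change):
        self.billable_ops += 1    # one billable write per change
        return change

stream = []                        # offloaded copy of the data
mainframe = Mainframe()

# The legacy app pushes each change once.
for change in ("balance+100", "balance-30", "balance+5"):
    stream.append(mainframe.apply_change(change))

# Three modern apps each read the full history -- from the stream,
# never from the mainframe.
reads = [list(stream) for _ in range(3)]

print(mainframe.billable_ops)   # 3 writes, zero mainframe reads
print(len(reads), len(reads[0]))
```

Adding a fourth, fifth, or hundredth consuming application changes only the `reads` side of this model; `mainframe.billable_ops` never moves, which is the "No MIPS / MSU" annotation on the slide.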
  • 17. Mainframe Replacement Database change Microservices events SaaS data Customer experiences Streams of real time events Legacy App Modern App 1 Complex business logic Push changes once Write Write continuously Read continuously Modern App 2 Write continuously Read continuously MIPS / MSU MIPS / MSU MIPS / MSU Read No MIPS / MSU
  • 19. @KaiWaehner - www.kai-waehner.de Event Streaming Maturity Model (investment & time vs. value): 1. Initial Awareness / Pilot (1 Kafka Cluster) 2. Start to Build Pipeline / Deliver 1 New Outcome (1 Kafka Cluster) 3. Mission-Critical Deployment (Stretched, Hybrid, Multi-Region) 4. Build Contextual Event-Driven Apps (Stretched, Hybrid, Multi-Region) 5. Central Nervous System (Global Kafka). Product, Support, Training, Partners, Technical Account Management...
  • 20. @KaiWaehner - www.kai-waehner.de The Rise of Event Streaming. 2010: Apache Kafka created at LinkedIn by the future Confluent founders. 2014: Confluent founded. 2020: 80% of Fortune 100 companies trust and use Apache Kafka.
  • 21. @KaiWaehner - www.kai-waehner.de Confluent Platform Freedom of Choice Committer-driven Expertise Open Source | Community licensed Fully Managed Cloud Service Self-managed Software Training Partners Enterprise Support Professional Services ARCHITECT OPERATOR DEVELOPER EXECUTIVE Apache Kafka Dynamic Performance & Elasticity Self-Balancing Clusters | Tiered Storage Flexible DevOps Automation Operator | Ansible GUI-driven Mgmt & Monitoring Control Center | Proactive Support Event Streaming Database ksqlDB Rich Pre-built Ecosystem Connectors | Hub | Schema Registry Multi-language Development Non-Java Clients | REST Proxy Admin REST APIs Global Resilience Multi-Region Clusters | Replicator Cluster Linking Data Compatibility Schema Registry | Schema Validation Enterprise-grade Security RBAC | Secrets | Audit Logs TCO / ROI Revenue / Cost / Risk Impact Complete Engagement Model Efficient Operations at Scale Unrestricted Developer Productivity Production-stage Prerequisites Partnership for Business Success
  • 24. Connect: connecting today’s infrastructure with tomorrow’s technology to unlock the potential of all your enterprise data. Extract, Transform, Load: high-performance ETL for Apache Spark, Cloud, Windows, Linux, Unix and Hadoop MapReduce. Data Replication via CDC: real-time database replication to streaming platforms, Cloud, databases and data warehouses.
  • 25. Connect is the best solution for accessing and integrating mainframe and IBM i data with cloud frameworks in real time. Quickly and efficiently integrate ALL enterprise data, including mainframe and IBM i. Design-once, deploy-anywhere approach to sources and targets in real time. Reduce costs and development time from weeks to days. Secure and governed + unrivaled scalability and performance.
  • 26. Connect Data Replication Capabilities
  • 27. Top Use Cases: DB to DB Replication (Reporting, BI/ETL Integration, much more!); Migration (geographical, OS vendor, DB vendor, product version); Modernization (streaming platforms (Kafka, Kinesis), cloud DWs, DBs, application vendors).
  • 28. CDC that connects the enterprise. Single tool supports data replication and big data convergence: • Replicate data in real-time to feed applications or analytics • Real-time replication for hierarchical formats: Db2/z, IMS, and VSAM • Power business reporting and insights • Move only changed data. Connects legacy systems to the cloud: • Build streaming data pipelines from traditional systems to real-time applications • Support hybrid environments • Keep data lakes fresh
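"Move only changed data" is the heart of CDC. A minimal sketch of the idea: diff two table snapshots and emit insert/update/delete change events. Real CDC tools such as Connect CDC read the database transaction log instead of diffing snapshots, but the shape of the events that flow into the stream is similar; the table contents below are invented for illustration:

```python
# Toy CDC sketch: diff two table snapshots and emit change events.
# Real CDC reads the transaction log; this just shows the
# insert/update/delete event shapes that flow into the stream.

def capture_changes(before: dict, after: dict):
    events = []
    for key, row in after.items():
        if key not in before:
            events.append({"op": "insert", "key": key, "row": row})
        elif before[key] != row:
            events.append({"op": "update", "key": key, "row": row})
    for key in before:
        if key not in after:
            events.append({"op": "delete", "key": key})
    return events

before = {1: {"name": "Alice", "balance": 100},
          2: {"name": "Bob", "balance": 50}}
after  = {1: {"name": "Alice", "balance": 120},
          3: {"name": "Carol", "balance": 10}}

changes = capture_changes(before, after)
for event in changes:
    print(event)
# one update (key 1), one insert (key 3), one delete (key 2)
```

Only the three change events cross the wire, not the full tables, which is why CDC-based pipelines keep downstream systems fresh without repeatedly re-extracting the source.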
  • 29. Business Benefits with Connect CDC. Build Streaming Data Pipelines • Powers business decision-making with real-time data. Get a Consistent View • Keep your business in sync and consistent across the enterprise. Migrate Data • Zero downtime for database/application upgrades and system re-platforming. Keep Data Lakes Fresh • Ensure real-time updates on transactional systems, including IBM i, are captured & replicated. Enable Timely Reporting • Confidently satisfy even the most demanding SLAs. Resilient Data Delivery • Support data governance and security requirements.
  • 30. Flexible Replication Options: One Way, Two Way, Cascade, Bi-Directional, Distribute, Consolidate. Choose a topology or combine them to meet your data sharing needs.
  • 31. Connect CDC is cloud-platform enabled for ingest and stream. Precisely Connect sources: RDBMS (Sybase, Oracle, Informix, SQL Server, Db2 LUW), Mainframe (IMS, Db2 for z/OS, VSAM), IBM i (Db2 for IBM i). Targets: data streams for strategic projects such as real-time analytics, AI and machine learning.
  • 33. Precisely Connect Portal
  • 34. Precisely Connect Portal
  • 35. Precisely Connect Portal
  • 36. Precisely Connect Portal
  • 37. Precisely Connect Portal
  • 38. Confluent Control Center
  • 39. Kafdrop
  • 41. Your Contact: Bernd Stiene, Senior Account Executive. Phone: +49 152 5398 4389. Email: bernd.stiene@precisely.com. Product Sheet / eBook

Editor's Notes

  1. Horizontal Value Hierarchy diagram - showing use cases - and some examples… There’s obviously a lot of overlap here. Some use cases span multiple strategic objectives and business drivers - this diagram shows the primary relationship.
  2. Know the 5 stages and a talking point for each one. There’s a common pattern of how organizations adopt this technology. First, there is initial awareness or a pilot, where an organization is getting to know the technology. This is followed by the initial development of a basic event pipeline, and the delivery of at least 1 new business outcome - maybe provisioning a single source of truth for microservices, or offloading data from a mainframe. The third stage involves incorporating and leveraging stream processing. In this stage, an organization is not only collecting and transporting data in real-time, but also processing it for added value. The fourth stage is when an organization starts to build business-transforming contextual event-driven applications. This is a new category of applications - unique to event streaming - where real-time events can be combined with context to deliver powerful, profitable outcomes. The last stage is when event streaming is pervasive and becomes the central nervous system of the enterprise. Examples of this in the consumer world are Netflix and LinkedIn… and in the enterprise world are organizations like Capital One. Confluent accelerates the trajectory of customer journeys to event streaming through its products, support, training, our partner ecosystem and technical account management and services. Let’s talk about you - Where do you see your team on this journey today? How about your LOBs? Your company as a whole? Let’s talk for a few minutes about how we can get you where you need to go.
  3. The rise of event streaming can be traced back to 2010, when Apache Kafka was created by the future Confluent founders in Silicon Valley. From there, Kafka began spreading throughout Silicon Valley and across the US West Coast. [CLICK] Then, in 2014, Confluent was created with the goal of turning Kafka into an enterprise-ready software stack and cloud offering, after which the adoption of Kafka started to really accelerate. [CLICK] Fast forward to 2020: tens of thousands of companies across the world and across all kinds of industries are using Kafka for event streaming. What I tell my family and friends is: you are a Kafka user, whether you know it or not. When you use a smartphone, shop online, make a payment, read the news, listen to music, drive a car, book a flight, it's very likely that this is powered by Kafka behind the scenes. Kafka is applied even to use cases that I personally would never have predicted, like scientific research in astrophysics, where Kafka is used to automatically coordinate globally distributed large telescopes to record interstellar phenomena!
  4. What we build is a full, enterprise-ready platform that completes open source Apache Kafka. On top of Kafka, we build a set of features to unleash developer productivity, including the ability to leverage Kafka in languages other than Java, a rich pre-built ecosystem with more than 100 connectors so developers don't have to spend time building connectors themselves, and stream processing with the ease and familiarity of SQL. Kafka can sometimes be complex and difficult to operate at scale; we make that easy through GUI-based management and monitoring, DevOps automation including a Kubernetes Operator, and dynamic performance and elasticity in deploying Kafka. We also offer a set of features many organizations consider prerequisites for deploying mission-critical apps on Kafka. These include security features that control who has access to what, the ability to investigate potential security incidents via audit logs, schema validation to ensure that no 'dirty' data enters Kafka and only 'clean' data is in the system, and resilience features so that, for example, if your data center goes down, your customer-facing applications stay running. We offer all of this with freedom of choice, meaning you can choose self-managed software that you can deploy anywhere, including on-premises, public cloud, private cloud, containers, or Kubernetes, or our fully managed cloud service, available on all three major cloud providers. Importantly, underpinning all this is our committer-led expertise. We at Confluent have over X hours of experience with Kafka. We offer support, professional services, training, and a full partner ecosystem. Simply put, there is no other organization in the world better suited to be an enterprise partner, and none more capable of ensuring your success. This means everything to the organizations we work with.
  5. But what exactly is Precisely Connect? There are two flavors of our Connect solution. Connect offers an enterprise-grade end-to-end ETL solution that runs on single servers and runs natively on cluster deployments such as Hadoop and Spark. Connect also has a data replication/CDC (Change Data Capture) component that supports cloud and hybrid architectures. We're a focused market player and proud of that. We concentrate on two use cases: offloading data and/or processing from mainframe and IBM i systems, and delivering that data to modern cloud data warehouses and EDWs. We have more offload capabilities than any other company on the planet, and we are working on some of the biggest data warehouse and mainframe/IBM i offload projects in the country.
  6. … And as you have probably guessed, you can do all of this with us. We can proudly say that our product Precisely Connect is the best solution for accessing and integrating mainframe and IBM i data, in batch as well as in real time. We can connect efficiently and quickly to a wide variety of platforms, whether mainframe, Hive, relational databases, or others. With Connect you develop your workflows only once and then decide on which environment or platform they should run. This makes you future-proof and saves you from reinventing the wheel again and again. If you rebuild or extend your infrastructure, say moving from Windows to a big data environment, you simply tell the workflow that it should now run on the big data environment instead of the Windows system; no further changes are needed. As a result, development times can shrink from weeks to days or even hours, because Connect users work with the workflows through a graphical interface and can quickly and easily deploy them in any environment. Finally, your data is of course transferred securely, and you see the corresponding data lineage, which is particularly important when data is transferred from one system to another, all combined with our outstanding scalability and performance.
  7. Modernization: mainframe to streaming, modern DWH, cloud. Traditional database-to-database replication. Migration from older database versions to newer versions.
  8. Real-time replication. DB2/z, IMS, and VSAM. Mainframe fixed- and variable-length files. COBOL copybooks. EBCDIC. Packed fields. Redefines, etc. Only deltas are transferred. Hybrid systems are supported. Data funnel. Data can be transformed in flight.
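To make the copybook terminology above concrete, here is a minimal Python sketch of what a replication engine has to do with EBCDIC character data and COMP-3 packed-decimal fields. The field layouts and sample bytes are invented for illustration; a real product like Connect derives them from the COBOL copybook rather than hard-coding them.

```python
def unpack_comp3(raw: bytes, scale: int = 0):
    """Decode a COBOL COMP-3 (packed decimal) field.

    Each byte holds two 4-bit digits; the low nibble of the last byte
    is the sign (0xD = negative, 0xC/0xF = positive/unsigned).
    """
    digits = []
    for b in raw[:-1]:
        digits.append(b >> 4)
        digits.append(b & 0x0F)
    digits.append(raw[-1] >> 4)      # high nibble of last byte is still a digit
    sign = raw[-1] & 0x0F            # low nibble of last byte is the sign
    value = int("".join(str(d) for d in digits))
    if sign == 0x0D:
        value = -value
    # Apply the implied decimal point, e.g. PIC S9(3)V99 -> scale=2
    return value / 10**scale if scale else value

# EBCDIC character data is decoded with a code-page codec such as cp037
name = bytes([0xD2, 0x81, 0x86, 0x92, 0x81]).decode("cp037")   # "Kafka"

# A packed field 0x12 0x34 0x5C with two implied decimals -> 123.45
amount = unpack_comp3(bytes([0x12, 0x34, 0x5C]), scale=2)
```

Redefines and variable-length records add further bookkeeping on top of this, which is exactly the complexity the replication tooling hides from the user.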
  9. We have talked a lot about ingesting complex mainframe flat files, but we also support all your typical relational sources well. Connect CDC provides real-time Change Data Capture from basically any source to any target. Our biggest use cases are pulling data from DB2, VSAM, Oracle, and SQL Server and pushing those changes to Kafka. No data loss. Automatic restart.
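"Pushing those changes to Kafka" usually means emitting one change event per row operation, keyed by the source table's primary key so that all changes to a row land in the same partition and keep their order. The sketch below builds such an envelope with the standard library only; the field names (`op`, `before`, `after`) mimic common CDC formats such as Debezium's and are an assumption, not Connect's actual wire format, and the producer call is shown only as a comment.

```python
import json
import time

def make_change_event(table: str, op: str, key: dict, before=None, after=None):
    """Build a (key, value) byte pair for a CDC event, ready for a Kafka producer.

    op is 'c' (insert), 'u' (update), or 'd' (delete); 'before' and
    'after' carry the row images around the change.
    """
    value = {
        "source": {"table": table},
        "op": op,
        "before": before,
        "after": after,
        "ts_ms": int(time.time() * 1000),
    }
    # JSON-encode key and value; sort_keys makes the key bytes deterministic,
    # so the same row always hashes to the same partition.
    return (json.dumps(key, sort_keys=True).encode(),
            json.dumps(value).encode())

key_bytes, value_bytes = make_change_event(
    "DB2.ACCOUNTS", "u",
    key={"ACCT_ID": 42},
    before={"ACCT_ID": 42, "BALANCE": 100.0},
    after={"ACCT_ID": 42, "BALANCE": 80.0},
)
# With a real broker you would hand this to a producer, e.g.:
# producer.produce("accounts-changes", key=key_bytes, value=value_bytes)
```

Keying by primary key is what makes "no data loss, automatic restart" workable downstream: consumers can replay the topic and always see each row's changes in order.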
  10. Connect's CDC capabilities are extremely flexible and powerful in how they share data at the table, column, and row level. This slide illustrates the flexibility of Connect at the higher, architectural level. A wide variety of topologies are supported, as are combinations of these topologies.
  11. Summary, at a high level: Connect and Databricks work closely together. On the Connect side, we pull in data from a variety of different platforms and sources. As mentioned before, the data can come directly from legacy systems, whether that's a flat file, a VSAM file, IMS, an IBM i database, or a regular DB2 database. We can not only extract all these data and formats but also access them via CDC (Change Data Capture) and process them in real time, with a minimal footprint and the highest-performance engine on the market. We also support relational databases and various enterprise data warehouses, and can ingest data from flat files, XML, JSON, and even directly from the Hadoop file system (HDFS). We can also process streaming data from Kafka. All of this applies to today's common cloud environments: AWS, Microsoft Azure, and Google Cloud Platform (GCP). Connect integrates natively with Databricks, so Databricks can be used for data ingest. We can transform, cleanse, and merge very large data volumes, all running directly inside Databricks. And because we are natively integrated with Delta Lake, we can deliver the data straight into Delta Lake with our high-performance connectors. Users never have to worry about how the pipeline they built is actually deployed and executed; they can focus entirely on the business task at hand and the requirements of the integration itself, and Connect takes care of the rest automatically. Once the data is available in Delta Lake, users can apply a variety of architectures and maintain different layers, e.g. a raw-data layer plus bronze, silver, gold, and enriched layers. This is of course not a one-off process: data in Delta Lake can be overwritten and/or supplemented at any time, so that it is ultimately available to downstream applications such as BI reports, BI analytics, or ML applications. Similar architectures and use cases exist for integrating data into Snowflake, Amazon Redshift, and Google BigQuery, among others.
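The "overwrite and/or supplement" behavior described above is essentially an upsert (merge) into a table keyed on a primary key; in Spark/Delta Lake this is a MERGE INTO statement. The pure-Python sketch below only models those semantics on an in-memory dict so the logic is visible without a Spark cluster; the table contents, column names, and the `__deleted` marker are invented for illustration.

```python
def merge_into(table: dict, changes: list, key_col: str) -> dict:
    """Model Delta-style MERGE semantics on an in-memory 'table'.

    Each change is a row dict. A row carrying a '__deleted' flag removes
    the matching target row; any other row is inserted or overwrites the
    existing row with the same key (an upsert).
    """
    for row in changes:
        k = row[key_col]
        if row.get("__deleted"):
            table.pop(k, None)       # WHEN MATCHED ... DELETE
        else:
            clean = {c: v for c, v in row.items() if c != "__deleted"}
            table[k] = clean         # WHEN MATCHED UPDATE / WHEN NOT MATCHED INSERT
    return table

# Silver layer before the merge: one account row
silver = {1: {"id": 1, "balance": 100.0}}
silver = merge_into(
    silver,
    [{"id": 1, "balance": 80.0},     # update existing row
     {"id": 2, "balance": 50.0}],    # insert new row
    "id",
)
```

Replaying CDC change events through a merge like this is what keeps the bronze/silver/gold layers in sync with the source system between full reloads.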
  12. Question: Can you also analyze log data from the mainframe? Answer: Yes. With the Ironstream product, log data from mainframe and/or IBM i systems can be made available inside e.g. Splunk or ServiceNow and analyzed there. Question: Is Cloudera also supported? Answer: Besides Databricks, we also support Cloudera CDP, among others. We have been certified again for this environment, having already been certified for the previous CDH (Cloudera) and HDP (Hortonworks) environments. On top of that come Microsoft Azure, AWS EMR, and Google Cloud Platform, to name the currently most common ones, which are also the ones we currently see at enterprises.