From Zero to Hero with Kafka Connect
a.k.a. A practical guide to becoming l33t with Kafka Connect
@rmoff | #kafkasummit
What is Kafka Connect?
Streaming Integration with Kafka Connect
Sources (e.g. syslog) -> Kafka Connect -> Kafka Brokers
Streaming Integration with Kafka Connect
Kafka Brokers -> Kafka Connect -> Sinks (e.g. Amazon S3, Google BigQuery)
Streaming Integration with Kafka Connect
Sources (e.g. syslog) -> Kafka Connect -> Kafka Brokers -> Kafka Connect -> Sinks (e.g. Amazon S3, Google BigQuery)
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
  "connection.url": "jdbc:mysql://asgard:3306/demo",
  "table.whitelist": "sales,orders,customers"
}
https://docs.confluent.io/current/connect/
Look Ma, No Code!
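That JSON is all Connect needs. As an illustrative sketch (the connector name jdbc-source-demo and the localhost:8083 worker address are assumptions, not from the slides), this is roughly how the config is registered via Connect's REST API — shown as building the request rather than sending it, so the snippet stays self-contained:

```python
import json

# The JDBC source configuration from the slide, as a Python dict.
config = {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://asgard:3306/demo",
    "table.whitelist": "sales,orders,customers",
}

def create_connector_request(name, config, host="http://localhost:8083"):
    """Build the (method, url, body) tuple for Connect's REST API.

    PUT /connectors/<name>/config is idempotent: it creates the
    connector if absent and updates its config otherwise.
    """
    url = f"{host}/connectors/{name}/config"
    return "PUT", url, json.dumps(config)

method, url, body = create_connector_request("jdbc-source-demo", config)
```

Sending it is then a single `curl -X PUT -H "Content-Type: application/json"` (or `requests.put`) with that URL and body.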
Streaming Pipelines
RDBMS -> Kafka Connect -> Kafka -> Kafka Connect -> Amazon S3 / HDFS
Writing to data stores from Kafka
App -> Kafka -> Kafka Connect -> Data Store
Evolve processing from old systems to new
Existing App -> RDBMS -> Kafka Connect -> Kafka -> New App
Demo #1
http://rmoff.dev/ksldn19_demo-code
REST API - tricks
http://go.rmoff.net/connector-status
REST API - tricks
http://go.rmoff.net/selectively-delete-connectors
(h/t to @madewithtea)
Configuring Kafka Connect
Inside the API - connectors, transforms, converters
Kafka Connect basics
Source -> Kafka Connect -> Kafka
Connectors
Source -> Kafka Connect (Connector) -> Kafka
Connectors
"config": {
  [...]
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "connection.url": "jdbc:postgresql://postgres:5432/",
  "topics": "asgard.demo.orders"
}
Connectors
Source -> (native data) Connector -> (Connect Record) -> Kafka
Converters
Source -> (native data) Connector -> (Connect Record) Converter -> (byte[]) -> Kafka
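The converter stage can be sketched in a few lines: the connector hands Connect a structured record, and the converter turns it into the byte[] actually written to Kafka. JSON is used here purely so the example is dependency-free; the same shape applies to Avro.

```python
import json

# A "Connect record" as a plain dict -- the connector produces this
# from the source system's native data format.
connect_record = {"id": 7, "make": "Pontiac", "model": "Aztek"}

def json_converter(record: dict) -> bytes:
    """Serialise a Connect record to the bytes written to Kafka --
    roughly what a JSON converter does, minus schema handling."""
    return json.dumps(record).encode("utf-8")

payload = json_converter(connect_record)   # the byte[] on the wire
roundtrip = json.loads(payload)            # a consumer deserialises it back
```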
Serialisation & Schemas
Avro | Protobuf | JSON | CSV
-> Confluent Schema Registry
https://qconnewyork.com/system/files/presentation-slides/qcon_17_-_schemas_and_apis.pdf
The Confluent Schema Registry
Source -> Kafka Connect -> Avro Message -> Kafka Connect -> Target
(the Avro schema itself is stored in, and fetched from, the Schema Registry)
Converters
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
Set as a global default per-worker; optionally can be overridden per-connector
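That precedence (worker default, overridable per connector) can be modelled as a simple dict merge. The property names below are real Connect settings; the pairing of values is illustrative.

```python
# Worker-level defaults (from the worker properties file).
worker_defaults = {
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
}

# A connector whose config overrides just the value converter.
connector_overrides = {
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
}

def effective_converters(worker: dict, connector: dict) -> dict:
    """Connector-level settings win over worker-level defaults."""
    return {**worker, **connector}

effective = effective_converters(worker_defaults, connector_overrides)
```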
What about internal converters?
value.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
key.internal.value.converter=org.apache.kafka.connect.json.JsonConverter
value.internal.value.converter=org.apache.kafka.connect.json.JsonConverter
key.internal.value.converter.bork.bork.bork=org.apache.kafka.connect.json.JsonConverter
key.internal.value.please.just.work.converter=org.apache.kafka.connect.json.JsonConverter
Single Message Transforms
Source -> Connector -> Transform(s) -> Converter -> Kafka
Single Message Transforms
"config": {
  [...]
  "transforms": "addDateToTopic,labelFooBar",
  "transforms.addDateToTopic.type": "org.apache.kafka.connect.transforms.TimestampRouter",
  "transforms.addDateToTopic.topic.format": "${topic}-${timestamp}",
  "transforms.addDateToTopic.timestamp.format": "YYYYMM",
  "transforms.labelFooBar.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
  "transforms.labelFooBar.renames": "delivery_address:shipping_address"
}
The transforms property lists which transforms to apply ("do these transforms"); each named transform then gets its own config via transforms.<name>.* properties.
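What these two transforms do to each record can be sketched in plain Python — a toy re-implementation for illustration, not the Connect API. (Connect's YYYYMM pattern is a java.text.SimpleDateFormat; strftime's %Y%m is the equivalent here.)

```python
import time

def timestamp_router(topic: str, ts_epoch: int,
                     topic_format: str = "${topic}-${timestamp}",
                     timestamp_format: str = "%Y%m") -> str:
    """Toy TimestampRouter: rewrite the destination topic name
    using the record's timestamp."""
    stamp = time.strftime(timestamp_format, time.gmtime(ts_epoch))
    return (topic_format
            .replace("${topic}", topic)
            .replace("${timestamp}", stamp))

def replace_field(value: dict, renames: dict) -> dict:
    """Toy ReplaceField$Value: rename fields in the record value."""
    return {renames.get(k, k): v for k, v in value.items()}

# 1557409348 is 2019-05-09T13:42:28Z, the timestamp seen in the demo data.
topic = timestamp_router("orders", ts_epoch=1557409348)
value = replace_field({"id": 7, "delivery_address": "754 Becker Way"},
                      {"delivery_address": "shipping_address"})
```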
Extensible
Connector -> Transform(s) -> Converter
Confluent Hub
hub.confluent.io
Deploying Kafka Connect
Connectors, Tasks, and Workers
Connectors and Tasks
JDBC Source -> JDBC Task #1, JDBC Task #2
S3 Sink -> S3 Task #1
Tasks and Workers
Worker: JDBC Task #1, JDBC Task #2, S3 Task #1
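How a connector's work is split into tasks can be illustrated with the JDBC source: given tasks.max and a list of tables, the tables are divided among the tasks. The round-robin split below is an illustration only — how work is actually partitioned is up to each connector implementation.

```python
def assign_tables_to_tasks(tables, max_tasks):
    """Round-robin a list of tables across at most max_tasks tasks --
    a sketch of how a source connector might split its work."""
    n_tasks = min(max_tasks, len(tables))
    assignments = [[] for _ in range(n_tasks)]
    for i, table in enumerate(tables):
        assignments[i % n_tasks].append(table)
    return assignments

# Three tables, "tasks.max": 2  ->  two tasks sharing the tables.
tasks = assign_tables_to_tasks(["sales", "orders", "customers"], max_tasks=2)
```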
Kafka Connect Standalone Worker
Worker: JDBC Task #1, JDBC Task #2, S3 Task #1
Offsets (stored in a local file)
"Scaling" the Standalone Worker
Worker #1: JDBC Task #1, JDBC Task #2 (its own Offsets file)
Worker #2: S3 Task #1 (its own Offsets file)
Fault-tolerant? Nope.
Kafka Connect Distributed Worker
Worker: JDBC Task #1, JDBC Task #2, S3 Task #1
Offsets, Config, Status (stored in Kafka topics)
Fault-tolerant? Yeah!
Kafka Connect cluster
Scaling the Distributed Worker
Kafka Connect cluster: Worker + Worker, sharing JDBC Task #1, JDBC Task #2, S3 Task #1
Offsets, Config, Status (stored in Kafka topics)
Fault-tolerant? Yeah!
Distributed Worker - fault tolerance
Kafka Connect cluster: Worker + Worker, running JDBC Task #1 and S3 Task #1
Offsets, Config, Status (stored in Kafka topics)
Distributed Worker - fault tolerance
A worker dies; the surviving worker now runs JDBC Task #1, JDBC Task #2, and S3 Task #1
Offsets, Config, Status (stored in Kafka topics)
Multiple Distributed Clusters
Kafka Connect cluster #1: JDBC Task #1, S3 Task #1 (its own Offsets, Config, Status)
Kafka Connect cluster #2: JDBC Task #2 (its own Offsets, Config, Status)
Troubleshooting Kafka Connect
$ curl -s "http://localhost:8083/connectors/source-debezium-orders/status" | \
    jq '.connector.state'
"RUNNING"

$ curl -s "http://localhost:8083/connectors/source-debezium-orders/status" | \
    jq '.tasks[0].state'
"FAILED"

http://go.rmoff.net/connector-status
The Connector can be RUNNING whilst its Task has FAILED.
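Because the connector can report RUNNING while its tasks have failed, a status check needs to inspect every task. A sketch of that check, parsing a trimmed example of the real /status response shape (no network involved, so it stays self-contained):

```python
import json

# A trimmed example of GET /connectors/<name>/status output.
status_json = """
{
  "name": "source-debezium-orders",
  "connector": {"state": "RUNNING", "worker_id": "kafka-connect:8083"},
  "tasks": [
    {"id": 0, "state": "FAILED", "worker_id": "kafka-connect:8083"}
  ]
}
"""

def failed_tasks(status: dict):
    """Return ids of tasks that are not RUNNING -- the connector's own
    state alone is not enough to know the pipeline is healthy."""
    return [t["id"] for t in status["tasks"] if t["state"] != "RUNNING"]

status = json.loads(status_json)
bad = failed_tasks(status)
```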
Troubleshooting Kafka Connect
$ curl -s "http://localhost:8083/connectors/source-debezium-orders-00/status" | \
    jq '.tasks[0].trace'
"org.apache.kafka.connect.errors.ConnectException\n\tat
io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)\n\tat
io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:197)\n\tat
io.debezium.connector.mysql.BinlogReader$ReaderThreadLifecycleListener.onCommunicationFailure(BinlogReader.java:1018)\n\tat
com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:950)\n\tat
com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:580)\n\tat
com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:825)\n\tat
java.lang.Thread.run(Thread.java:748)\nCaused by: java.io.EOFException\n\tat
com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:190)\n\tat
com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.readInteger(ByteArrayInputStream.java:46)\n\tat
com.github.shyiko.mysql.binlog.event.deserialization.EventHeaderV4Deserializer.deserialize(EventHeaderV4Deserializer.java:35)\n\tat
com.github.shyiko.mysql.binlog.event.deserialization.EventHeaderV4Deserializer.deserialize(EventHeaderV4Deserializer.java:27)\n\tat
com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:212)\n\tat
io.debezium.connector.mysql.BinlogReader$1.nextEvent(BinlogReader.java:224)\n\tat
com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:922)\n\t... 3 more"
The log is the source of truth
$ confluent log connect
$ docker-compose logs kafka-connect
$ cat /var/log/kafka/connect.log
Kafka Connect
Symptom, not Cause:
"Task is being killed and will not recover until manually restarted"
Common errors
org.apache.kafka.common.errors.SerializationException:
Unknown magic byte!
Mismatched converters
"value.converter": "AvroConverter"
org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
Messages are not Avro
ⓘ Use the correct Converter for the source data
Mixed serialisation methods
"value.converter": "AvroConverter"
org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
Some messages are not Avro
ⓘ Use error handling to deal with bad messages
Error Handling and DLQ
Handled:
- Convert (read/write from Kafka, [de]-serialisation)
- Transform
Not Handled:
- Start (connections to a data store)
- Poll / Put (read/write from/to data store)*
* can be retried by Connect
https://cnfl.io/connect-dlq
Fail Fast
Source topic messages -> Kafka Connect -> Sink messages
https://cnfl.io/connect-dlq
YOLO ¯\_(ツ)_/¯
Source topic messages -> Kafka Connect -> Sink messages
errors.tolerance=all
https://cnfl.io/connect-dlq
Dead Letter Queue
Source topic messages -> Kafka Connect -> Sink messages (bad messages -> dead letter queue)
errors.tolerance=all
errors.deadletterqueue.topic.name=my_dlq
https://cnfl.io/connect-dlq
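The behaviour of errors.tolerance=all with a dead letter queue can be sketched as follows — a toy model of the routing, not Connect's internals: records that fail deserialisation are diverted to a DLQ instead of killing the task.

```python
import json

def process_with_dlq(raw_messages):
    """Deserialise each message; route failures to a dead-letter list
    instead of raising (the errors.tolerance=all + DLQ behaviour)."""
    sink, dlq = [], []
    for raw in raw_messages:
        try:
            sink.append(json.loads(raw))
        except json.JSONDecodeError:
            dlq.append(raw)  # in Connect this is produced to my_dlq
    return sink, dlq

messages = [b'{"id": 1}', b'not-json-at-all', b'{"id": 2}']
sink, dlq = process_with_dlq(messages)
```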
Re-processing the Dead Letter Queue
Kafka Connect (Avro sink): source topic messages -> sink messages, failures -> dead letter queue
Kafka Connect (JSON sink): reads the dead letter queue
https://cnfl.io/connect-dlq
No fields found using key and value schemas for table: foo-bar
No fields found using key and value schemas for table: foo-bar
JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields
No fields found using key and value schemas for table: foo-bar
JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields
Schema, where art thou?
Schema, where art thou?
Schema (from the Schema Registry):
$ http localhost:8081/subjects[...] | jq '.schema|fromjson'
{
  "type": "record",
  "name": "Value",
  "namespace": "asgard.demo.ORDERS",
  "fields": [
    {
      "name": "id",
      "type": "int"
    },
    {
      "name": "order_id",
      "type": ["null", "int"],
      "default": null
    },
    {
      "name": "customer_id",
      "type": ["null", "int"],
      "default": null
    },
Messages (Avro):
$ kafkacat -b localhost:9092 -C -t mysql-
QF@Land RoverDefender 90SheffieldSwift LL
Parkway(2019-05-09T13:42:28Z(2019-05-09T1
No fields found using key and value schemas for table: foo-bar
Schema, where art thou? -> No schema!
Messages (JSON):
{
"id": 7,
"order_id": 7,
"customer_id": 849,
"order_total_usd": 98062.21,
"make": "Pontiac",
"model": "Aztek",
"delivery_city": "Leeds",
"delivery_company": "Price-Zieme"
"delivery_address": "754 Becker W
"CREATE_TS": "2019-05-09T13:42:28
"UPDATE_TS": "2019-05-09T13:42:28
}
No fields found using key and value schemas for table: foo-bar
Schema, where art thou?
-> You need a schema!
-> Use Avro, or use JSON with schemas.enable=true
Either way, you need to re-configure the producer of the data.
Or, use KSQL to impose a schema on the JSON/CSV data and reserialise it to Avro!
https://www.confluent.io/blog/data-wrangling-apache-kafka-ksql
JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields
Schema, where art thou?
Messages (JSON), with the schema embedded per message:
{
"schema": {
"type": "struct",
"fields": [
{
"type": "int32",
"optional": false,
"field": "id"
},
[...]
],
"optional": true,
"name": "asgard.demo.OR
},
"payload": {
"id": 611,
"order_id": 111,
"customer_id": 851,
"order_total_usd": 1821
"make": "Kia",
"model": "Sorento",
"delivery_city": "Swind
JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields
Schema, where art thou?
-> Your JSON must be structured as expected:
{
"schema": {
"type": "struct",
"fields": [
{
"type": "int32",
"optional": false,
"field": "id"
},
[...]
],
"optional": true,
"name": "asgard.demo.ORDERS
},
"payload": {
"id": 611,
"order_id": 111,
"customer_id": 851,
"order_total_usd": 182190.9
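A sketch of producing that envelope: wrap a plain payload in the {"schema": ..., "payload": ...} structure the JsonConverter (schemas.enable=true) expects. The type mapping here is deliberately minimal (int32/string only) and is an illustration, not a full Connect schema implementation.

```python
import json

def with_embedded_schema(payload: dict) -> str:
    """Wrap a record in the schema/payload envelope required by
    JsonConverter with schemas.enable=true."""
    fields = [
        {"type": "int32" if isinstance(v, int) else "string",
         "optional": False,
         "field": k}
        for k, v in payload.items()
    ]
    envelope = {
        "schema": {"type": "struct", "fields": fields, "optional": True},
        "payload": payload,
    }
    return json.dumps(envelope)

msg = json.loads(with_embedded_schema({"id": 611, "make": "Kia"}))
```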
Using KSQL to apply schema to your data
JSON
Avro
Kafka
Connect
KSQL
CREATE STREAM ORDERS_JSON
(id INT,
order_total_usd DOUBLE,
delivery_city VARCHAR)
WITH (KAFKA_TOPIC='orders-json',
VALUE_FORMAT='JSON');
CREATE STREAM ORDERS_AVRO WITH (
VALUE_FORMAT='AVRO',
KAFKA_TOPIC='orders-avro')
AS SELECT * FROM ORDERS_JSON;
Containers
Kafka Connect images on Docker Hub
confluentinc/cp-kafka-connect-base
kafka-connect-elasticsearch
kafka-connect-jdbc
kafka-connect-hdfs
[…]
confluentinc/cp-kafka-connect
Adding connectors to a container
confluentinc/cp-kafka-connect-base + connector JAR from Confluent Hub
At runtime
kafka-connect:
  image: confluentinc/cp-kafka-connect:5.2.1
  environment:
    CONNECT_PLUGIN_PATH: '/usr/share/java,/usr/share/confluent-hub-components'
  command:
    - bash
    - -c
    - |
      confluent-hub install --no-prompt neo4j/kafka-connect-neo4j:1.0.0
      /etc/confluent/docker/run
http://rmoff.dev/ksln19-connect-docker
Build a new image
FROM confluentinc/cp-kafka-connect:5.2.1
ENV CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components"
RUN confluent-hub install --no-prompt neo4j/kafka-connect-neo4j:1.0.0
Automating connector creation
# Download JDBC drivers
cd /usr/share/java/kafka-connect-jdbc/
curl https://cdn.mysql.com/Downloads/Connector-J/mysql-connector-java-8.0.13.tar.gz | tar xz
#
# Now launch Kafka Connect
/etc/confluent/docker/run &
#
# Wait for Kafka Connect listener
while [ $$(curl -s -o /dev/null -w %{http_code} http://$$CONNECT_REST_ADVERTISED_HOST_NAME:$…
  echo -e $$(date) " Kafka Connect listener HTTP state: " $$(curl -s -o /dev/null -w %{http_…
  sleep 5
done
#
# Create JDBC Source connector
curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "jdbc_source_mysql_00",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://mysql:3306/demo",
    "connection.user": "connect_user",
    "connection.password": "asgard",
    "topic.prefix": "mysql-00-",
    "table.whitelist": "demo.customers"
  }
}'
# Don't let the container die
sleep infinity
#EOF
http://rmoff.dev/ksln19-connect-docker
Confluent Community Components
Apache Kafka with a bunch of cool stuff! For free!
Sources: Database Changes | Log Events | IoT Data | Web Events | …
Destinations & uses: CRM | Data Warehouse | Database | Hadoop | Data Integration | Monitoring | Analytics | Custom Apps | Transformations | Real-time Applications | …
Confluent Platform:
Apache Kafka®: Core | Connect API | Streams API
Data Compatibility: Schema Registry
Monitoring & Administration: Confluent Control Center | Security
Operations: Replicator | Auto Data Balancing
Development and Connectivity: Clients | Connectors | REST Proxy | CLI
SQL Stream Processing: KSQL
Runs in: Datacenter | Public Cloud | Confluent Cloud (customer self-managed or Confluent fully-managed)
http://cnfl.io/book-bundle
http://cnfl.io/slack
Please rate my session in the Kafka Summit app :-)
Eventos y Microservicios - Santander TechTalkconfluent
 
Q&A with Confluent Experts: Navigating Networking in Confluent Cloud
Q&A with Confluent Experts: Navigating Networking in Confluent CloudQ&A with Confluent Experts: Navigating Networking in Confluent Cloud
Q&A with Confluent Experts: Navigating Networking in Confluent Cloudconfluent
 
Citi TechTalk Session 2: Kafka Deep Dive
Citi TechTalk Session 2: Kafka Deep DiveCiti TechTalk Session 2: Kafka Deep Dive
Citi TechTalk Session 2: Kafka Deep Diveconfluent
 
Build real-time streaming data pipelines to AWS with Confluent
Build real-time streaming data pipelines to AWS with ConfluentBuild real-time streaming data pipelines to AWS with Confluent
Build real-time streaming data pipelines to AWS with Confluentconfluent
 
Q&A with Confluent Professional Services: Confluent Service Mesh
Q&A with Confluent Professional Services: Confluent Service MeshQ&A with Confluent Professional Services: Confluent Service Mesh
Q&A with Confluent Professional Services: Confluent Service Meshconfluent
 
Citi Tech Talk: Event Driven Kafka Microservices
Citi Tech Talk: Event Driven Kafka MicroservicesCiti Tech Talk: Event Driven Kafka Microservices
Citi Tech Talk: Event Driven Kafka Microservicesconfluent
 
Confluent & GSI Webinars series - Session 3
Confluent & GSI Webinars series - Session 3Confluent & GSI Webinars series - Session 3
Confluent & GSI Webinars series - Session 3confluent
 
Citi Tech Talk: Messaging Modernization
Citi Tech Talk: Messaging ModernizationCiti Tech Talk: Messaging Modernization
Citi Tech Talk: Messaging Modernizationconfluent
 
Citi Tech Talk: Data Governance for streaming and real time data
Citi Tech Talk: Data Governance for streaming and real time dataCiti Tech Talk: Data Governance for streaming and real time data
Citi Tech Talk: Data Governance for streaming and real time dataconfluent
 
Confluent & GSI Webinars series: Session 2
Confluent & GSI Webinars series: Session 2Confluent & GSI Webinars series: Session 2
Confluent & GSI Webinars series: Session 2confluent
 
Data In Motion Paris 2023
Data In Motion Paris 2023Data In Motion Paris 2023
Data In Motion Paris 2023confluent
 
Confluent Partner Tech Talk with Synthesis
Confluent Partner Tech Talk with SynthesisConfluent Partner Tech Talk with Synthesis
Confluent Partner Tech Talk with Synthesisconfluent
 
The Future of Application Development - API Days - Melbourne 2023
The Future of Application Development - API Days - Melbourne 2023The Future of Application Development - API Days - Melbourne 2023
The Future of Application Development - API Days - Melbourne 2023confluent
 
The Playful Bond Between REST And Data Streams
The Playful Bond Between REST And Data StreamsThe Playful Bond Between REST And Data Streams
The Playful Bond Between REST And Data Streamsconfluent
 

More from confluent (20)

Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...
Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...
Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...
 
Santander Stream Processing with Apache Flink
Santander Stream Processing with Apache FlinkSantander Stream Processing with Apache Flink
Santander Stream Processing with Apache Flink
 
Unlocking the Power of IoT: A comprehensive approach to real-time insights
Unlocking the Power of IoT: A comprehensive approach to real-time insightsUnlocking the Power of IoT: A comprehensive approach to real-time insights
Unlocking the Power of IoT: A comprehensive approach to real-time insights
 
Workshop híbrido: Stream Processing con Flink
Workshop híbrido: Stream Processing con FlinkWorkshop híbrido: Stream Processing con Flink
Workshop híbrido: Stream Processing con Flink
 
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...
 
AWS Immersion Day Mapfre - Confluent
AWS Immersion Day Mapfre   -   ConfluentAWS Immersion Day Mapfre   -   Confluent
AWS Immersion Day Mapfre - Confluent
 
Eventos y Microservicios - Santander TechTalk
Eventos y Microservicios - Santander TechTalkEventos y Microservicios - Santander TechTalk
Eventos y Microservicios - Santander TechTalk
 
Q&A with Confluent Experts: Navigating Networking in Confluent Cloud
Q&A with Confluent Experts: Navigating Networking in Confluent CloudQ&A with Confluent Experts: Navigating Networking in Confluent Cloud
Q&A with Confluent Experts: Navigating Networking in Confluent Cloud
 
Citi TechTalk Session 2: Kafka Deep Dive
Citi TechTalk Session 2: Kafka Deep DiveCiti TechTalk Session 2: Kafka Deep Dive
Citi TechTalk Session 2: Kafka Deep Dive
 
Build real-time streaming data pipelines to AWS with Confluent
Build real-time streaming data pipelines to AWS with ConfluentBuild real-time streaming data pipelines to AWS with Confluent
Build real-time streaming data pipelines to AWS with Confluent
 
Q&A with Confluent Professional Services: Confluent Service Mesh
Q&A with Confluent Professional Services: Confluent Service MeshQ&A with Confluent Professional Services: Confluent Service Mesh
Q&A with Confluent Professional Services: Confluent Service Mesh
 
Citi Tech Talk: Event Driven Kafka Microservices
Citi Tech Talk: Event Driven Kafka MicroservicesCiti Tech Talk: Event Driven Kafka Microservices
Citi Tech Talk: Event Driven Kafka Microservices
 
Confluent & GSI Webinars series - Session 3
Confluent & GSI Webinars series - Session 3Confluent & GSI Webinars series - Session 3
Confluent & GSI Webinars series - Session 3
 
Citi Tech Talk: Messaging Modernization
Citi Tech Talk: Messaging ModernizationCiti Tech Talk: Messaging Modernization
Citi Tech Talk: Messaging Modernization
 
Citi Tech Talk: Data Governance for streaming and real time data
Citi Tech Talk: Data Governance for streaming and real time dataCiti Tech Talk: Data Governance for streaming and real time data
Citi Tech Talk: Data Governance for streaming and real time data
 
Confluent & GSI Webinars series: Session 2
Confluent & GSI Webinars series: Session 2Confluent & GSI Webinars series: Session 2
Confluent & GSI Webinars series: Session 2
 
Data In Motion Paris 2023
Data In Motion Paris 2023Data In Motion Paris 2023
Data In Motion Paris 2023
 
Confluent Partner Tech Talk with Synthesis
Confluent Partner Tech Talk with SynthesisConfluent Partner Tech Talk with Synthesis
Confluent Partner Tech Talk with Synthesis
 
The Future of Application Development - API Days - Melbourne 2023
The Future of Application Development - API Days - Melbourne 2023The Future of Application Development - API Days - Melbourne 2023
The Future of Application Development - API Days - Melbourne 2023
 
The Playful Bond Between REST And Data Streams
The Playful Bond Between REST And Data StreamsThe Playful Bond Between REST And Data Streams
The Playful Bond Between REST And Data Streams
 

Recently uploaded

My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024The Digital Insurer
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphNeo4j
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentationphoebematthew05
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
Unlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsUnlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsPrecisely
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxnull - The Open Security Community
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsRizwan Syed
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 

Recently uploaded (20)

My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
costume and set research powerpoint presentation
costume and set research powerpoint presentationcostume and set research powerpoint presentation
costume and set research powerpoint presentation
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
Unlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsUnlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power Systems
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
 
The transition to renewables in India.pdf
The transition to renewables in India.pdfThe transition to renewables in India.pdf
The transition to renewables in India.pdf
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 

From Zero to Hero with Kafka Connect (Robin Moffat, Confluent) Kafka Summit London 2019

  • 1. From Zero to Hero with Kafka Connect @rmoff A practical guide to becoming l33t with Kafka Connect a.k.a. #kafkasummit
  • 2. @rmoff #kafkasummit From Zero to Hero with Kafka Connect What is Kafka Connect?
  • 3. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Sources Streaming Integration with Kafka Connect Kafka Brokers Kafka Connect syslog
  • 4. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Streaming Integration with Kafka Connect Kafka Brokers Kafka Connect Amazon S3 Google BigQuery Sinks
  • 5. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Streaming Integration with Kafka Connect Kafka Brokers Kafka Connect syslog Amazon S3 Google BigQuery
  • 6. From Zero to Hero with Kafka Connect @rmoff #kafkasummit { "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector", "connection.url": "jdbc:mysql://asgard:3306/demo", "table.whitelist": "sales,orders,customers" } https://docs.confluent.io/current/connect/ Look Ma, No Code!
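The "no code" JSON above is submitted to Kafka Connect's REST API. A minimal sketch of doing that, assuming a worker listening on localhost:8083 and using a made-up connector name (`jdbc-source-demo`):

```shell
# Write the JDBC source config from the slide to a file; the connector
# name "jdbc-source-demo" is illustrative.
cat > /tmp/jdbc-source.json <<'EOF'
{
  "name": "jdbc-source-demo",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://asgard:3306/demo",
    "table.whitelist": "sales,orders,customers"
  }
}
EOF

# Sanity-check the JSON before posting it
python3 -m json.tool /tmp/jdbc-source.json > /dev/null && echo "JSON OK"

# Create the connector (requires a running Connect worker on localhost:8083):
# curl -X POST -H "Content-Type: application/json" \
#      --data @/tmp/jdbc-source.json http://localhost:8083/connectors
```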
  • 7. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Streaming Pipelines RDBMS Kafka Connect Kafka Connect Amazon S3 HDFS
  • 8. From Zero to Hero with Kafka Connect @rmoff #kafkasummit KafkaConnect Writing to data stores from Kafka App Data Store
  • 9. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Evolve processing from old systems to new RDBMS Existing App New App <x> Kafka Connect
  • 10. @rmoff #kafkasummit From Zero to Hero with Kafka Connect Demo #1 http://rmoff.dev/ksldn19_demo-code
  • 11. From Zero to Hero with Kafka Connect @rmoff #kafkasummit REST API - tricks http://go.rmoff.net/connector-status
  • 12. From Zero to Hero with Kafka Connect @rmoff #kafkasummit REST API - tricks http://go.rmoff.net/selectively-delete-connectors (h/t to @madewithtea)
  • 13. @rmoff #kafkasummit From Zero to Hero with Kafka Connect Configuring Kafka Connect Inside the API - connectors, transforms, converters
  • 14. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Kafka Connect basics: Source → Kafka Connect → Kafka
  • 15. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Connectors: Source → Connector (Kafka Connect) → Kafka
  • 16. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Connectors "config": { [...] "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector", "connection.url": "jdbc:postgresql://postgres:5432/", "topics": "asgard.demo.orders" }
  • 17. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Connectors: Source → [native data] → Connector → [Connect Record] → Kafka
  • 18. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Converters: Source → [native data] → Connector → [Connect Record] → Converter → [bytes[]] → Kafka
  • 19. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Serialisation & Schemas -> Confluent Schema Registry Avro Protobuf JSON CSV https://qconnewyork.com/system/files/presentation-slides/qcon_17_-_schemas_and_apis.pdf
  • 20. From Zero to Hero with Kafka Connect @rmoff #kafkasummit The Confluent Schema Registry Source Avro Message Target Schema RegistryAvro Schema Kafka Connect Kafka ConnectAvro Message
  • 21. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Converters key.converter=io.confluent.connect.avro.AvroConverter key.converter.schema.registry.url=http://localhost:8081 value.converter=io.confluent.connect.avro.AvroConverter value.converter.schema.registry.url=http://localhost:8081 Set as a global default per-worker; optionally can be overridden per-connector
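Since the worker settings above are only defaults, here is a sketch of a per-connector override, with a hypothetical connector name and topic (the `value.converter` keys themselves are the standard per-connector override properties):

```shell
# Hypothetical sink config that overrides the worker's default Avro
# converter with the JSON converter for just this connector.
cat > /tmp/sink-json-override.json <<'EOF'
{
  "name": "sink-with-json-override",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://postgres:5432/",
    "topics": "orders-json",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"
  }
}
EOF
python3 -m json.tool /tmp/sink-json-override.json > /dev/null && echo "JSON OK"
```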
  • 22. From Zero to Hero with Kafka Connect @rmoff #kafkasummit What about internal converters? value.converter=org.apache.kafka.connect.json.JsonConverter internal.value.converter=org.apache.kafka.connect.json.JsonConverter key.internal.value.converter=org.apache.kafka.connect.json.JsonConverter value.internal.value.converter=org.apache.kafka.connect.json.JsonConverter key.internal.value.converter.bork.bork.bork=org.apache.kafka.connect.json.JsonConverter key.internal.value.please.just.work.converter=org.apache.kafka.connect.json.JsonConverter
  • 23. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Single Message Transforms: Source → Connector → Transform(s) → Converter → Kafka
  • 24. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Single Message Transforms "config": { [...] "transforms": "addDateToTopic,labelFooBar", "transforms.addDateToTopic.type": "org.apache.kafka.connect.transforms.TimestampRouter", "transforms.addDateToTopic.topic.format": "${topic}-${timestamp}", "transforms.addDateToTopic.timestamp.format": "YYYYMM", "transforms.labelFooBar.type": "org.apache.kafka.connect.transforms.ReplaceField$Value", "transforms.labelFooBar.renames": "delivery_address:shipping_address", } Do these transforms Transforms config Config per transform
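To make the TimestampRouter transform concrete, here is a plain-shell illustration (not the SMT itself) of the topic name it derives from the config above:

```shell
# The SMT substitutes ${topic} and ${timestamp} (formatted YYYYMM) into
# "topic.format". Simulating that substitution for a May 2019 record:
topic="asgard.demo.orders"
ts="201905"
routed="${topic}-${ts}"
echo "$routed"   # -> asgard.demo.orders-201905
```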
  • 25. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Extensible Connector Transform(s) Converter
  • 26. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Confluent Hub hub.confluent.io
  • 27. @rmoff #kafkasummit From Zero to Hero with Kafka Connect Deploying Kafka Connect Connectors, Tasks, and Workers
  • 28. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Connectors and Tasks JDBC Source S3 Sink JDBC Task #1 JDBC Task #2 S3 Task #1
  • 29. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Connectors and Tasks JDBC Source S3 Sink JDBC Task #1 JDBC Task #2 S3 Task #1
  • 30. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Connectors and Tasks JDBC Source S3 Sink JDBC Task #1 JDBC Task #2 S3 Task #1
  • 31. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Tasks and Workers JDBC Source S3 Sink JDBC Task #1 JDBC Task #2 S3 Task #1 Worker
  • 32. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Kafka Connect Standalone Worker JDBC Task #1 JDBC Task #2 S3 Task #1 Worker Offsets
  • 33. From Zero to Hero with Kafka Connect @rmoff #kafkasummit "Scaling" the Standalone Worker JDBC Task #1 JDBC Task #2 S3 Task #1 Worker Offsets Worker Offsets Fault-tolerant? Nope.
  • 34. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Kafka Connect Distributed Worker JDBC Task #1 JDBC Task #2 S3 Task #1 Offsets Config Status Fault-tolerant? Yeah! Worker Kafka Connect cluster
  • 35. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Scaling the Distributed Worker JDBC Task #1 JDBC Task #2 S3 Task #1 Offsets Config Status Fault-tolerant? Yeah! Worker Worker Kafka Connect cluster
  • 36. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Distributed Worker - fault tolerance JDBC Task #1 S3 Task #1 Offsets Config Status Worker Worker Kafka Connect cluster
  • 37. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Distributed Worker - fault tolerance JDBC Task #1 JDBC Task #2 S3 Task #1 Offsets Config Status Worker Kafka Connect cluster
  • 38. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Multiple Distributed Clusters JDBC Task #1 S3 Task #1 Offsets Config Status Kafka Connect cluster #1 JDBC Task #2 Kafka Connect cluster #2 Offsets Config Status
  • 39. @rmoff #kafkasummit From Zero to Hero with Kafka Connect Troubleshooting Kafka Connect
  • 40. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Troubleshooting Kafka Connect $ curl -s "http://localhost:8083/connectors/source-debezium-orders/status" | jq '.connector.state' "RUNNING" $ curl -s "http://localhost:8083/connectors/source-debezium-orders/status" | jq '.tasks[0].state' "FAILED" http://go.rmoff.net/connector-status RUNNING FAILED Connector Task
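The same jq-style filtering can be tried against a canned status payload, so the shape of the response is visible without a live worker (python3 stands in for jq here for portability):

```shell
# A canned /connectors/<name>/status response: connector RUNNING but its
# task FAILED -- exactly the case the slide warns about.
status='{"name":"source-debezium-orders","connector":{"state":"RUNNING"},"tasks":[{"id":0,"state":"FAILED"}]}'

# Equivalent of: curl -s .../status | jq '.connector.state, .tasks[0].state'
summary=$(echo "$status" | python3 -c 'import json, sys
d = json.load(sys.stdin)
print(d["connector"]["state"], d["tasks"][0]["state"])')
echo "$summary"   # -> RUNNING FAILED
```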
  • 41. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Troubleshooting Kafka Connect curl -s "http://localhost:8083/connectors/source-debezium-orders-00/status" | jq '.tasks[0].trace'
"org.apache.kafka.connect.errors.ConnectException
  at io.debezium.connector.mysql.AbstractReader.wrap(AbstractReader.java:230)
  at io.debezium.connector.mysql.AbstractReader.failed(AbstractReader.java:197)
  at io.debezium.connector.mysql.BinlogReader$ReaderThreadLifecycleListener.onCommunicationFailure(BinlogReader.java:1018)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:950)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.connect(BinaryLogClient.java:580)
  at com.github.shyiko.mysql.binlog.BinaryLogClient$7.run(BinaryLogClient.java:825)
  at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.EOFException
  at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.read(ByteArrayInputStream.java:190)
  at com.github.shyiko.mysql.binlog.io.ByteArrayInputStream.readInteger(ByteArrayInputStream.java:46)
  at com.github.shyiko.mysql.binlog.event.deserialization.EventHeaderV4Deserializer.deserialize(EventHeaderV4Deserializer.java:35)
  at com.github.shyiko.mysql.binlog.event.deserialization.EventHeaderV4Deserializer.deserialize(EventHeaderV4Deserializer.java:27)
  at com.github.shyiko.mysql.binlog.event.deserialization.EventDeserializer.nextEvent(EventDeserializer.java:212)
  at io.debezium.connector.mysql.BinlogReader$1.nextEvent(BinlogReader.java:224)
  at com.github.shyiko.mysql.binlog.BinaryLogClient.listenForEventPackets(BinaryLogClient.java:922)
  ... 3 more"
  • 42. From Zero to Hero with Kafka Connect @rmoff #kafkasummit The log is the source of truth $ confluent log connect $ docker-compose logs kafka-connect $ cat /var/log/kafka/connect.log
  • 43. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Kafka Connect
  • 44. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Kafka Connect Symptom not Cause: "Task is being killed and will not recover until manually restarted"
  • 45. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Kafka Connect
  • 46. @rmoff #kafkasummit From Zero to Hero with Kafka Connect Common errors
  • 47. From Zero to Hero with Kafka Connect @rmoff #kafkasummit org.apache.kafka.common.errors.SerializationException: Unknown magic byte!
  • 48. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Mismatched converters "value.converter": "AvroConverter" org.apache.kafka.common.errors.SerializationException: Unknown magic byte! Use the correct Converter for the source dataⓘ Messages are not Avro
  • 49. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Mixed serialisation methods "value.converter": "AvroConverter" org.apache.kafka.common.errors.SerializationException: Unknown magic byte! Some messages are not Avro Use error handling to deal with bad messagesⓘ
  • 50. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Error Handling and DLQ Handled Convert (read/write from Kafka, [de]-serialisation) Transform Not Handled Start (Connections to a data store) Poll / Put (Read/Write from/to data store)* * can be retried by Connect https://cnfl.io/connect-dlq
  • 51. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Fail Fast Kafka Connect Source topic messages Sink messages https://cnfl.io/connect-dlq
  • 52. From Zero to Hero with Kafka Connect @rmoff #kafkasummit YOLO ¯\_(ツ)_/¯ Kafka Connect Source topic messages Sink messages errors.tolerance=all https://cnfl.io/connect-dlq
  • 53. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Dead Letter Queue Kafka Connect Source topic messages Sink messages Dead letter queue errors.tolerance=all errors.deadletterqueue.topic.name=my_dlq https://cnfl.io/connect-dlq
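Putting the dead letter queue settings together: a sketch of a sink config with error tolerance and a DLQ, using hypothetical connector and topic names. The error-handling keys are the ones from the slide, plus `errors.deadletterqueue.context.headers.enable`, which records the failure reason in message headers:

```shell
cat > /tmp/sink-with-dlq.json <<'EOF'
{
  "name": "sink-with-dlq",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://postgres:5432/",
    "topics": "orders",
    "errors.tolerance": "all",
    "errors.deadletterqueue.topic.name": "my_dlq",
    "errors.deadletterqueue.context.headers.enable": "true"
  }
}
EOF
python3 -m json.tool /tmp/sink-with-dlq.json > /dev/null && echo "JSON OK"
```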
  • 54. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Re-processing the Dead Letter Queue Source topic messages Sink messages Dead letter queue Kafka Connect (Avro sink) Kafka Connect (JSON sink) https://cnfl.io/connect-dlq
  • 55. From Zero to Hero with Kafka Connect @rmoff #kafkasummit No fields found using key and value schemas for table: foo-bar
  • 56. From Zero to Hero with Kafka Connect @rmoff #kafkasummit No fields found using key and value schemas for table: foo-bar JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields
  • 57. From Zero to Hero with Kafka Connect @rmoff #kafkasummit No fields found using key and value schemas for table: foo-bar JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields Schema, where art thou?
  • 58. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Schema, where art thou? $ http localhost:8081/subjects/[...] | jq '.schema|fromjson' { "type": "record", "name": "Value", "namespace": "asgard.demo.ORDERS", "fields": [ { "name": "id", "type": "int" }, { "name": "order_id", "type": [ "null", "int" ], "default": null }, { "name": "customer_id", "type": [ "null", "int" ], "default": null }, Schema (From Schema Registry) Messages (Avro) $ kafkacat -b localhost:9092 -C -t mysql- QF@Land RoverDefender 90SheffieldSwift LL Parkway(2019-05-09T13:42:28Z(2019-05-09T1
  • 59. From Zero to Hero with Kafka Connect @rmoff #kafkasummit No fields found using key and value schemas for table: foo-bar Schema, where art thou? No schema! Messages (JSON) { "id": 7, "order_id": 7, "customer_id": 849, "order_total_usd": 98062.21, "make": "Pontiac", "model": "Aztek", "delivery_city": "Leeds", "delivery_company": "Price-Zieme" "delivery_address": "754 Becker W "CREATE_TS": "2019-05-09T13:42:28 "UPDATE_TS": "2019-05-09T13:42:28 }
  • 60. From Zero to Hero with Kafka Connect @rmoff #kafkasummit No fields found using key and value schemas for table: foo-bar Schema, where art thou? -> You need a schema! -> Use Avro, or use JSON with schemas.enable=true Either way, you need to re-configure the producer of the data Or, use KSQL to impose a schema on the JSON/CSV data and reserialise it to Avro ! https://www.confluent.io/blog/data-wrangling-apache-kafka-ksql
  • 61. From Zero to Hero with Kafka Connect @rmoff #kafkasummit JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields Schema, where art thou?
Messages (JSON), schema embedded per JSON message:
{
  "schema": {
    "type": "struct",
    "fields": [
      { "type": "int32", "optional": false, "field": "id" },
      [...]
    ],
    "optional": true,
    "name": "asgard.demo.OR
  },
  "payload": {
    "id": 611,
    "order_id": 111,
    "customer_id": 851,
    "order_total_usd": 1821
    "make": "Kia",
    "model": "Sorento",
    "delivery_city": "Swind
  • 62. From Zero to Hero with Kafka Connect @rmoff #kafkasummit JsonDeserializer with schemas.enable requires "schema" and "payload" fields and may not contain additional fields Schema, where art thou? -> Your JSON must be structured as expected.
{
  "schema": {
    "type": "struct",
    "fields": [
      { "type": "int32", "optional": false, "field": "id" },
      [...]
    ],
    "optional": true,
    "name": "asgard.demo.ORDERS
  },
  "payload": {
    "id": 611,
    "order_id": 111,
    "customer_id": 851,
    "order_total_usd": 182190.9
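To make the expected envelope concrete, here is a small Python sketch that builds a message in the schema/payload shape JsonConverter (with schemas.enable=true) requires — the struct name and fields are illustrative, based on the ORDERS example above:

```python
import json

# The envelope JsonConverter expects: a "schema" describing the types
# and a "payload" carrying the values -- and nothing else at the top level.
message = {
    "schema": {
        "type": "struct",
        "optional": False,
        "name": "asgard.demo.ORDERS",
        "fields": [
            {"type": "int32", "optional": False, "field": "id"},
            {"type": "string", "optional": True, "field": "delivery_city"},
        ],
    },
    "payload": {"id": 611, "delivery_city": "Swindon"},
}

serialised = json.dumps(message)
decoded = json.loads(serialised)
# Exactly the two required top-level fields, no extras
assert set(decoded) == {"schema", "payload"}
print(decoded["payload"]["id"])  # 611
```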
  • 63. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Using KSQL to apply schema to your data: JSON -> Avro -> Kafka Connect
CREATE STREAM ORDERS_JSON (id INT, order_total_usd DOUBLE, delivery_city VARCHAR)
  WITH (KAFKA_TOPIC='orders-json', VALUE_FORMAT='JSON');

CREATE STREAM ORDERS_AVRO
  WITH (VALUE_FORMAT='AVRO', KAFKA_TOPIC='orders-avro') AS
  SELECT * FROM ORDERS_JSON;
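Once KSQL has reserialised the data to Avro, a sink connector can consume the new topic with the AvroConverter. A hedged sketch of such a sink configuration — the Elasticsearch connector class is real, but the connection details are illustrative and the exact required settings (e.g. type.name, key.ignore) vary by connector version:

```json
{
  "name": "sink_orders_avro",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "orders-avro",
    "connection.url": "http://elasticsearch:9200",
    "type.name": "_doc",
    "key.ignore": "true",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
  }
}
```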
  • 65. @rmoff #kafkasummit From Zero to Hero with Kafka Connect Containers
  • 66. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Kafka Connect images on Docker Hub confluentinc/cp-kafka-connect-base kafka-connect-elasticsearch kafka-connect-jdbc kafka-connect-hdfs […] confluentinc/cp-kafka-connect
  • 67. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Adding connectors to a container confluentinc/cp-kafka-connect-base JAR Confluent Hub
  • 68. From Zero to Hero with Kafka Connect @rmoff #kafkasummit At runtime: JAR via confluentinc/cp-kafka-connect-base
kafka-connect:
  image: confluentinc/cp-kafka-connect:5.2.1
  environment:
    CONNECT_PLUGIN_PATH: '/usr/share/java,/usr/share/confluent-hub-components'
  command:
    - bash
    - -c
    - |
      confluent-hub install --no-prompt neo4j/kafka-connect-neo4j:1.0.0
      /etc/confluent/docker/run
http://rmoff.dev/ksln19-connect-docker
  • 69. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Build a new image: JAR via confluentinc/cp-kafka-connect-base
FROM confluentinc/cp-kafka-connect:5.2.1
ENV CONNECT_PLUGIN_PATH="/usr/share/java,/usr/share/confluent-hub-components"
RUN confluent-hub install --no-prompt neo4j/kafka-connect-neo4j:1.0.0
  • 70. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Automating connector creation
# Download JDBC drivers
cd /usr/share/java/kafka-connect-jdbc/
curl https://cdn.mysql.com/Downloads/Connector-J/mysql-connector-java-8.0.13.tar.gz | tar xz
# Now launch Kafka Connect
/etc/confluent/docker/run &
# Wait for Kafka Connect listener
while [ $$(curl -s -o /dev/null -w %{http_code} http://$$CONNECT_REST_ADVERTISED_HOST_NAME:$…
  echo -e $$(date) " Kafka Connect listener HTTP state: " $$(curl -s -o /dev/null -w %{http_…
  sleep 5
done
# Create JDBC Source connector
curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "jdbc_source_mysql_00",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:mysql://mysql:3306/demo",
    "connection.user": "connect_user",
    "connection.password": "asgard",
    "topic.prefix": "mysql-00-",
    "table.whitelist": "demo.customers"
  }
}'
# Don't let the container die
sleep infinity
http://rmoff.dev/ksln19-connect-docker
  • 71. #EOF
  • 72. From Zero to Hero with Kafka Connect @rmoff #kafkasummit Confluent Community Components: Apache Kafka with a bunch of cool stuff! For free! [Diagram: Confluent Platform — Apache Kafka® Core, Connect API, Streams API; Schema Registry for data compatibility; Confluent Control Center and Security for monitoring and administration; Replicator and Auto Data Balancing for operations; Clients, Connectors, REST Proxy and CLI for development and connectivity; KSQL for SQL stream processing. Self-managed in the datacenter or public cloud, or fully managed in Confluent Cloud. Sources: database changes, log events, IoT data, web events, CRM; sinks: data warehouse, database, Hadoop, monitoring, analytics, custom and real-time applications.]
  • 73. From Zero to Hero with Kafka Connect @rmoff #kafkasummit http://cnfl.io/book-bundle
  • 74. @rmoff #kafkasummit From Zero to Hero with Kafka Connect http://cnfl.io/slack @rmoff #kafkasummit Please rate my session in the Kafka Summit app :-)