Introduction to Kafka Streams
Presented By:
Anshika Agrawal
Sr. Software Consultant
KnolX Etiquettes
Lack of etiquette and manners is a huge turn-off.
Punctuality
Join the session 5 minutes prior to the session start time. We start on time and conclude on time!
Feedback
Make sure to submit constructive feedback for all sessions, as it is very helpful for the presenter.
Silent Mode
Please keep your window on mute.
Avoid Disturbance
Avoid leaving your window unmuted after asking a question.
Our Agenda
01. Introduction to Apache Kafka: a brief introduction to Apache Kafka.
02. Kafka Core Concepts: the components and basic terminology used in Kafka core, and how they relate.
03. Connect Core Concepts: what Kafka Connect is, how to use it, and its use cases.
04. Kafka Streams, Real-Time Processing: what real-time processing actually is and how Kafka Streams works.
05. Kafka Streams Core Concepts: what Kafka Streams is and its key capabilities.
06. Demo: a short demo of how to implement a Kafka Streams application.
Introduction to Apache Kafka
Data, Data, and Data Everywhere!
A Real-Life Scenario
Common Kafka use cases: messaging, log aggregation, metrics, stream processing, and website activity tracking.
What is Apache Kafka?
● Distributed streaming platform.
● Client library for building applications and microservices over unbounded data.
● Applications interact with the cluster to process streams of data.
● Data is represented as key-value records (a minimal producer sketch follows).
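A minimal sketch of that last point (not from the slides; the broker address localhost:9092 and the topic name "events" are assumptions): publishing a key-value record with the standard Java producer client.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Each record sent to a topic is a key-value pair.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", "user-42", "page_view"));
        } // close() flushes any buffered records before returning
    }
}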
How Does It Look?
A typical flow: create a stream → Kafka server → process the stream.
Kafka Core Concepts
Apache Kafka Core Concepts
● Cluster
● Consumer
● Consumer Groups
● Partitions
● Broker
● Topic
● Producer
● Offset
How do they work?
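A minimal sketch tying these concepts together (not from the slides; the topic "events", group id "analytics", and broker address are assumptions): a consumer joins a consumer group, subscribes to a topic, and reads records along with the partition and offset each one came from.

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class SimpleConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // a broker in the cluster
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "analytics");               // consumer group
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");       // start from the oldest available offset

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events"));                            // topic; its partitions are shared across the group
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d key=%s value=%s%n",
                            record.partition(), record.offset(), record.key(), record.value());
                }
            }
        }
    }
}

Each partition is consumed by at most one member of a consumer group, and the offset tracks how far that member has read.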
Connect Core Concepts
Kafka Connect
● A system that sits between a data source/sink and the Kafka cluster.
● Requires only configuration, no custom code.
● Comes in two flavours: source connectors and sink connectors.
● A Kafka component for connecting to external systems and moving data between them and Kafka.
A View of Kafka Connect
External source → source connector → Kafka → sink connector → external sink.
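As an illustration of "just configure it", here is a sketch (not from the slides) of registering the FileStreamSource connector that ships with Kafka through Connect's REST API. The Connect worker address, connector name, file path, and topic are assumptions; the text block needs Java 15+.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterConnector {
    public static void main(String[] args) throws Exception {
        // Connector config: read lines from a local file and publish them to a topic.
        String payload = """
            {
              "name": "file-source-demo",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "tasks.max": "1",
                "file": "/tmp/input.txt",
                "topic": "connect-demo"
              }
            }
            """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8083/connectors")) // default Connect REST port
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}

A sink connector is configured the same way, with a sink connector class and the topics it should drain into the external system.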
Kafka Streams: Real-Time Processing
Stream Processing in Kafka
● Streams API: Kafka Streams, a library for building streaming applications.
● Consumer API: applications that retrieve data from Kafka servers into which Kafka producers publish real-time messages.
● KSQL: an interactive, SQL-like interface for stream processing.
Kafka Streams Core Concepts
Kafka Streams
● Kafka Streams is a library for building streaming applications (a minimal topology is sketched below).
● Input data must be streamed into a Kafka topic.
● Provides parallel processing, scalability, and fault tolerance.
● No separate processing cluster is needed, i.e. it can be deployed anywhere.
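A minimal topology sketch (not from the slides; the application id, broker address, and topic names "input-topic"/"output-topic" are assumptions): read a topic, transform each value, and write the result to another topic.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-app");     // also names the consumer group and state dirs
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // only a Kafka cluster is needed, no processing cluster
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("input-topic");
        source.mapValues(value -> value.toUpperCase())                       // per-record transformation
              .to("output-topic");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();                                                     // runs until the JVM shuts down
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Running more instances with the same application id scales the work out: Kafka redistributes the input partitions across the instances automatically.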
Capabilities of Kafka Streams
● Interoperates with both streams and tables.
● Fault-tolerant, scalable, and efficient local state.
● Built-in unit-testing support.
● Can be deployed in containers and managed with Kubernetes.
● Grouping and continuously updated aggregates.
● Flexible windowing capabilities (sketched below).
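A sketch of grouping plus windowing (not from the slides; the topic "page-views", the 5-minute window size, and the broker address are assumptions, and TimeWindows.ofSizeWithNoGrace needs Kafka Streams 3.0+): count records per key in tumbling windows, producing a continuously updating aggregate backed by fault-tolerant local state.

import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.apache.kafka.streams.kstream.Windowed;

public class WindowedCounts {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "windowed-counts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> views = builder.stream("page-views");

        // Group by the record key, bucket into 5-minute tumbling windows, and count.
        // The KTable is updated continuously as new records arrive.
        KTable<Windowed<String>, Long> counts = views
                .groupByKey()
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(5)))
                .count();

        counts.toStream().foreach((windowedKey, count) ->
                System.out.printf("key=%s windowStart=%s count=%d%n",
                        windowedKey.key(), windowedKey.window().startTime(), count));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}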
Demo
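A sketch of what such a demo typically covers (not from the slides; the topic names "text-input" and "word-counts" are assumptions): the classic word count, which re-keys the stream, groups it, and writes a continuously updated count per word to an output topic.

import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class WordCountDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "word-count-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> lines = builder.stream("text-input");

        KTable<String, Long> wordCounts = lines
                .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+"))) // one record per word
                .groupBy((key, word) -> word)                                           // re-key by the word itself
                .count();                                                               // continuously updated counts

        // Emit every count update to the output topic.
        wordCounts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}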
Thank You!
