1Confidential
Introducing Kafka's Streams API
Stream processing made simple
Target audience: technical staff, developers, architects
Expected duration for full deck: 45 minutes
2Confidential
0.10 Data processing (Streams API)
0.9 Data integration (Connect API)
Intra-cluster
replication
0.8
Apache Kafka: birthed as a messaging system, now a streaming platform
2012 2014 2015 2016 2017
Cluster mirroring,
data compression
0.7
2013
3Confidential
Kafka's Streams API: the easiest way to process data in Apache Kafka
Key Benefits of Apache Kafka's Streams API
• Build Apps, Not Clusters: no additional cluster required
• Cluster to go: elastic, scalable, distributed, fault-tolerant, secure
• Database to go: tables, local state, interactive queries
• Equally viable for S / M / L / XL / XXL use cases
• "Runs Everywhere": integrates with your existing deployment strategies such as containers, automation, cloud
Part of open source Apache Kafka, introduced in 0.10+
• Powerful client library to build stream processing apps
• Apps are standard Java applications that run on client machines
ā€¢ https://github.com/apache/kafka/tree/trunk/streams
Streams
API
Your App
Kafka
Cluster
4Confidential
Kafka's Streams API: Unix analogy
$ cat < in.txt | grep "apache" | tr a-z A-Z > out.txt
Kafka Cluster
Connect API Streams API
5Confidential
Streams API in the context of Kafka
Streams
API
Your App
Kafka
Cluster
Connect API
Connect API
Other Systems
Other Systems
6Confidential
When to use Kafka's Streams API
• Mainstream Application Development
• To build core business applications
• Microservices
• Fast Data apps for small and big data
• Reactive applications
• Continuous queries and transformations
• Event-triggered processes
• The "T" in ETL
• <and more>
Use case examples
• Real-time monitoring and intelligence
• Customer 360-degree view
• Fraud detection
• Location-based marketing
• Fleet management
• <and more>
7Confidential
Some public use cases in the wild & external articles
• Applying Kafka's Streams API for internal message delivery pipeline at LINE Corp.
• http://developers.linecorp.com/blog/?p=3960
• Kafka Streams in production at LINE, a social platform based in Japan with 220+ million users
• Microservices and reactive applications at Capital One
• https://speakerdeck.com/bobbycalderwood/commander-decoupled-immutable-rest-apis-with-kafka-streams
• User behavior analysis
• https://timothyrenner.github.io/engineering/2016/08/11/kafka-streams-not-looking-at-facebook.html
• Containerized Kafka Streams applications in Scala
• https://www.madewithtea.com/processing-tweets-with-kafka-streams.html
• Geo-spatial data analysis
• http://www.infolace.com/blog/2016/07/14/simple-spatial-windowing-with-kafka-streams/
• Language classification with machine learning
• https://dzone.com/articles/machine-learning-with-kafka-streams
8Confidential
Do more with less
9Confidential
Architecture comparison: use case example
Real-time dashboard for security monitoring
"Which of my data centers are under attack?"
10Confidential
Architecture comparison: use case example
Other
App
Dashboard
Frontend
App
Other
App
1 Capture business
events in Kafka
2 Must process events with
separate cluster (e.g. Spark)
4
Other apps access latest results
by querying these DBs
3 Must share latest results through
separate systems (e.g. MySQL)
Before: Undue complexity, heavy footprint, many technologies, split ownership with conflicting priorities
Your
ā€œJobā€
Other
App
Dashboard
Frontend
App
Other
App
1 Capture business
events in Kafka
2 Process events with standard
Java apps that use Kafka Streams
3 Now other apps can directly
query the latest results
With Kafka Streams: simplified, app-centric architecture, puts app owners in control
Kafka
Streams
Your App
11Confidential
12Confidential
13Confidential
How do I install the Streams API?
• There is and there should be no "installation" – Build Apps, Not Clusters!
• It's a library. Add it to your app like any other library.
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams</artifactId>
<version>0.10.1.1</version>
</dependency>
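With the library on the classpath, a Streams application only needs a couple of mandatory settings before it can run. A minimal sketch (the application id and broker address below are placeholders for your environment):

```properties
# Minimal Kafka Streams settings (values are examples)
application.id=my-first-streams-app
bootstrap.servers=broker1:9092
```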
14Confidential
"But wait a minute – where's THE CLUSTER to process the data?"
• No cluster needed – Build Apps, Not Clusters!
• Unlearn bad habits: "do cool stuff with data ≠ must have cluster"
Ok. Ok. Ok.
15Confidential
Organizational benefits: decouple teams and roadmaps, scale people
16Confidential
Organizational benefits: decouple teams and roadmaps, scale people
Infrastructure Team
(Kafka as a shared, multi-tenant service)
Fraud
detection
app
Payments team
Recommendations app
Mobile team
Security
alerts
app
Operations team
...more apps...
...
17Confidential
How do I package, deploy, monitor my apps? How do I …?
• Whatever works for you. Stick to what you/your company think is the best way.
• No magic needed.
• Why? Because an app that uses the Streams API is…a normal Java app.
18Confidential
Available APIs
19Confidential
The API is but the tip of the iceberg
API, coding
Org. processes
Reality™
Deployment
Operations
Security
…
Architecture
Debugging
20Confidential
• API option 1: DSL (declarative)
KStream<Integer, Integer> input =
builder.stream("numbers-topic");
// Stateless computation
KStream<Integer, Integer> doubled =
input.mapValues(v -> v * 2);
// Stateful computation
KTable<Integer, Integer> sumOfOdds = input
.filter((k, v) -> v % 2 != 0)
.selectKey((k, v) -> 1)
.groupByKey()
.reduce((v1, v2) -> v1 + v2, "sum-of-odds");
The preferred API for most use cases.
Particularly appeals to:
• Fans of Scala, functional programming
• Users familiar with e.g. Spark
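As a plain-Java illustration of what the two computations above produce (standard collections only, no Kafka involved; `doubled` and `sumOfOdds` are our own stand-in helpers, not Streams API calls):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class DslSemantics {
    // Stand-in for input.mapValues(v -> v * 2): stateless, per-record
    public static List<Integer> doubled(List<Integer> values) {
        return values.stream().map(v -> v * 2).collect(Collectors.toList());
    }

    // Stand-in for filter(odd) + selectKey(1) + groupByKey() + reduce(sum):
    // all odd values end up under a single key, so the result is one running sum
    public static int sumOfOdds(List<Integer> values) {
        return values.stream().filter(v -> v % 2 != 0).reduce(0, Integer::sum);
    }

    public static void main(String[] args) {
        List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5);
        System.out.println(doubled(numbers));   // [2, 4, 6, 8, 10]
        System.out.println(sumOfOdds(numbers)); // 9
    }
}
```

The key difference in real Kafka Streams is that these results are updated continuously as new records arrive, rather than computed once over a finite list.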
21Confidential
• API option 2: Processor API (imperative)
class PrintToConsoleProcessor implements Processor<K, V> {
@Override
public void init(ProcessorContext context) {}
@Override
public void process(K key, V value) {
System.out.println("Got value " + value);
}
@Override
public void punctuate(long timestamp) {}
@Override
public void close() {}
}
Full flexibility but more manual work
Appeals to:
• Users who require functionality that is not yet available in the DSL
• Users familiar with e.g. Storm, Samza
• Still, check out the DSL!
22Confidential
When to use Kafka Streams vs. Kafka's "normal" consumer clients
Kafka Streams
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
• Basically all the time
Kafka consumer clients (Java, C/C++, Python, Go, …)
• When you must interact with Kafka at a very low level and/or in a very special way
• Example: When integrating your own stream processing tool (Spark, Storm) with Kafka.
23Confidential
Code comparison
Featuring Kafka with Streams API <-> Spark Streaming
24Confidential
ā€My WordCount is better than your WordCountā€ (?)
Kafka
Spark
These isolated code snippets are nice (and actually quite similar) but they are not very meaningful. In practice, we
also need to read data from somewhere, write data back to somewhere, etc.ā€“ but we can see none of this here.
25Confidential
WordCount in Kafka
Word
Count
26Confidential
Compared to: WordCount in Spark 2.0
1
2
3
Runtime model leaks into
processing logic
(here: interfacing from
Spark with Kafka)
27Confidential
Compared to: WordCount in Spark 2.0
4
5
Runtime model leaks into
processing logic
(driver vs. executors)
28Confidential
Key concepts
29Confidential
Key concepts
30Confidential
Key concepts
31Confidential
Key concepts
Kafka Core Kafka Streams
32Confidential
Streams and Tables
Stream Processing meets Databases
33Confidential
34Confidential
35Confidential
Key observation: close relationship between Streams and Tables
http://www.confluent.io/blog/introducing-kafka-streams-stream-processing-made-simple
http://docs.confluent.io/current/streams/concepts.html#duality-of-streams-and-tables
36Confidential
37Confidential
Example: Streams and Tables in Kafka
Word Count
hello 2
kafka 1
world 1
… …
38Confidential
39Confidential
40Confidential
41Confidential
42Confidential
Example: continuously compute current users per geo-region
4
7
5
3
2
8 4
7
6
3
2
7
Alice
Real-time dashboard
"How many users younger than 30y, per region?"
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
-1
+1
user-locations
(mobile team)
user-prefs
(web team)
43Confidential
Example: continuously compute current users per geo-region
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
44Confidential
Example: continuously compute current users per geo-region
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
// Merge into detailed user profiles (continuously updated)
KTable<UserId, UserProfile> userProfiles =
userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs));
KTable userProfiles
45Confidential
Example: continuously compute current users per geo-region
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
// Merge into detailed user profiles (continuously updated)
KTable<UserId, UserProfile> userProfiles =
userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs));
// Compute per-region statistics (continuously updated)
KTable<Location, Long> usersPerRegion = userProfiles
.filter((userId, profile) -> profile.age < 30)
.groupBy((userId, profile) -> profile.location)
.count();
alice Europe
user-locations
Africa 3
… …
Asia 8
Europe 5
Africa 3
… …
Asia 7
Europe 6
KTable usersPerRegion
46Confidential
Example: continuously compute current users per geo-region
4
7
5
3
2
8 4
7
6
3
2
7
Alice
Real-time dashboard
"How many users younger than 30y, per region?"
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
-1
+1
user-locations
(mobile team)
user-prefs
(web team)
47Confidential
Streams meet Tables – in the DSL
48Confidential
Streams meet Tables
• Most use cases for stream processing require both Streams and Tables
• Essential for any stateful computations
• Kafka ships with first-class support for Streams and Tables
• Scalability, fault tolerance, efficient joins and aggregations, …
• Benefits include: simplified architectures, fewer moving pieces, less Do-It-Yourself work
49Confidential
Key features
50Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
51Confidential
Native, 100% compatible Kafka integration
Read from Kafka
Write to Kafka
52Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
53Confidential
Secure stream processing with the Streams API
• Your applications can leverage all client-side security features in Apache Kafka
• Security features include:
• Encrypting data-in-transit between applications and Kafka clusters
• Authenticating applications against Kafka clusters ("only some apps may talk to the production cluster")
• Authorizing applications against Kafka clusters ("only some apps may read data from sensitive topics")
54Confidential
Configuring security settings
• In general, you can configure both Kafka Streams plus the underlying Kafka clients in your apps
55Confidential
Configuring security settings
• Example: encrypting data-in-transit + client authentication to Kafka cluster
Full demo application at https://github.com/confluentinc/examples
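The configuration screenshot from the original slide is not reproduced here. As a rough sketch, encrypting data-in-transit plus SSL-based client authentication uses the standard Kafka client settings, along these lines (the file paths and passwords are placeholders for your environment):

```properties
# Encrypt traffic between the app and the brokers, and authenticate via SSL
security.protocol=SSL
ssl.truststore.location=/etc/security/tls/kafka.client.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.keystore.location=/etc/security/tls/kafka.client.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
```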
56Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
• Elastic and highly scalable
• Fault-tolerant
57Confidential
58Confidential
59Confidential
60Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
• Elastic and highly scalable
• Fault-tolerant
• Stateful and stateless computations
61Confidential
Stateful computations
• Stateful computations like aggregations (e.g. counting), joins, or windowing require state
• State stores are the backbone of state management
• … are local for best performance
• … are backed up to Kafka for elasticity and for fault-tolerance
• … are per stream task for isolation – think: share-nothing
• Pluggable storage engines
• Default: RocksDB (a key-value store) to allow for local state that is larger than available RAM
• You can also use your own, custom storage engine
• From the user perspective:
• DSL: no need to worry about anything, state management is automatically done for you
• Processor API: direct access to state stores – very flexible but more manual work
62Confidential
63Confidential
64Confidential
65Confidential
66Confidential
Use case: real-time, distributed joins at large scale
67Confidential
Use case: real-time, distributed joins at large scale
68Confidential
Use case: real-time, distributed joins at large scale
69Confidential
Stateful computations
• Use the Processor API to interact directly with state stores
Get the store
Use the store
70Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
• Elastic and highly scalable
• Fault-tolerant
• Stateful and stateless computations
• Interactive queries
71Confidential
72Confidential
Interactive Queries: architecture comparison
Kafka
Streams
App
App
App
App
1 Capture business
events in Kafka
2 Process the events
with Kafka Streams
4
Other apps query external
systems for latest results
! Must use external systems
to share latest results
App
App
App
1 Capture business
events in Kafka
2 Process the events
with Kafka Streams
3 Now other apps can directly
query the latest results
Before (0.10.0)
After (0.10.1): simplified, more app-centric architecture
Kafka
Streams
App
73Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
• Elastic and highly scalable
• Fault-tolerant
• Stateful and stateless computations
• Interactive queries
• Time model
74Confidential
Time
75Confidential
Time
A
C
B
76Confidential
Time
• You configure the desired time semantics through timestamp extractors
• Default extractor yields event-time semantics
• Extracts embedded timestamps of Kafka messages (introduced in v0.10)
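The extractor idea can be sketched with a tiny stand-in interface (illustration only; the real interface is org.apache.kafka.streams.processor.TimestampExtractor, and the nested interface below is our own simplification):

```java
public class TimeSemantics {
    // Simplified stand-in for the timestamp-extractor concept:
    // given a message's embedded timestamp and the wall clock at processing
    // time, decide which one drives the time semantics of the app.
    public interface TimestampChoice {
        long extract(long embeddedTimestampMs, long wallClockMs);
    }

    // event-time: trust the timestamp embedded in the Kafka message (the default)
    public static final TimestampChoice EVENT_TIME = (embedded, wallClock) -> embedded;

    // processing-time: ignore the message, use the wall clock at processing time
    public static final TimestampChoice PROCESSING_TIME = (embedded, wallClock) -> wallClock;

    public static void main(String[] args) {
        long embedded = 1480000000000L;        // when the event happened
        long now = System.currentTimeMillis(); // when we process it
        System.out.println(EVENT_TIME.extract(embedded, now) == embedded);  // true
        System.out.println(PROCESSING_TIME.extract(embedded, now) == now);  // true
    }
}
```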
77Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
• Elastic and highly scalable
• Fault-tolerant
• Stateful and stateless computations
• Interactive queries
• Time model
• Windowing
78Confidential
Windowing
• Group events in a stream using time-based windows
• Use case examples:
• Time-based analysis of ad impressions ("number of ads clicked in the past hour")
• Monitoring statistics of telemetry data ("1min/5min/15min averages")
(Figure: input events of users alice, bob, and dave, with colors representing the different users, plotted by event-time vs. processing-time; rectangles denote different event-time windows.)
79Confidential
Windowing in the DSL
TimeWindows.of(3000)
TimeWindows.of(3000).advanceBy(1000)
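To make the difference concrete: `TimeWindows.of(3000)` gives tumbling 3-second windows, where each event belongs to exactly one window, while adding `.advanceBy(1000)` gives hopping windows in which each event belongs to size/advance = 3 overlapping windows. The window-assignment arithmetic can be sketched as follows (`windowStartsFor` is our own helper, not the Streams internals):

```java
import java.util.ArrayList;
import java.util.List;

public class HoppingWindows {
    // Start times of all windows containing the given timestamp, for hopping
    // windows in the style of TimeWindows.of(sizeMs).advanceBy(advanceMs).
    public static List<Long> windowStartsFor(long tsMs, long sizeMs, long advanceMs) {
        List<Long> starts = new ArrayList<>();
        long latest = (tsMs / advanceMs) * advanceMs; // newest window containing tsMs
        for (long start = latest; start > tsMs - sizeMs && start >= 0; start -= advanceMs) {
            starts.add(0, start); // prepend so the list ends up in ascending order
        }
        return starts;
    }

    public static void main(String[] args) {
        // Tumbling, TimeWindows.of(3000): event at t=5500 is in exactly one window
        System.out.println(windowStartsFor(5500, 3000, 3000)); // [3000]
        // Hopping, TimeWindows.of(3000).advanceBy(1000): same event is in 3 windows
        System.out.println(windowStartsFor(5500, 3000, 1000)); // [3000, 4000, 5000]
    }
}
```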
80Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
• Elastic and highly scalable
• Fault-tolerant
• Stateful and stateless computations
• Interactive queries
• Time model
• Windowing
• Supports late-arriving and out-of-order data
81Confidential
Out-of-order and late-arriving data
• Is very common in practice, not a rare corner case
• Related to time model discussion
82Confidential
Out-of-order and late-arriving data: example when this will happen
Users with mobile phones enter
airplane, lose Internet connectivity
Emails are being written
during the 10h flight
Internet connectivity is restored,
phones will send queued emails now
83Confidential
Out-of-order and late-arriving data
• Is very common in practice, not a rare corner case
• Related to time model discussion
• We want control over how out-of-order data is handled, and handling must be efficient
• Example: We process data in 5-minute windows, e.g. compute statistics
• Option A: When an event arrives 1 minute late: update the original result!
• Option B: When an event arrives 2 hours late: discard it!
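The two options can be sketched as a toy windowed counter with a retention period, after which a window can no longer be updated (an illustration of the idea only; Kafka Streams implements this via window retention internally, and the `WindowedCounter` class below is hypothetical):

```java
import java.util.HashMap;
import java.util.Map;

public class WindowedCounter {
    private final long windowSizeMs;
    private final long retentionMs; // how long a closed window remains updatable
    private final Map<Long, Long> countsByWindowStart = new HashMap<>();
    private long maxSeenTimestampMs = Long.MIN_VALUE; // "stream time"

    public WindowedCounter(long windowSizeMs, long retentionMs) {
        this.windowSizeMs = windowSizeMs;
        this.retentionMs = retentionMs;
    }

    // Returns true if the event updated its window's result (option A),
    // false if the window has expired and the event is discarded (option B).
    public boolean process(long eventTimestampMs) {
        maxSeenTimestampMs = Math.max(maxSeenTimestampMs, eventTimestampMs);
        long windowStart = (eventTimestampMs / windowSizeMs) * windowSizeMs;
        if (windowStart + windowSizeMs + retentionMs <= maxSeenTimestampMs) {
            return false; // too late: the window is no longer retained
        }
        countsByWindowStart.merge(windowStart, 1L, Long::sum);
        return true;
    }

    public long countFor(long windowStartMs) {
        return countsByWindowStart.getOrDefault(windowStartMs, 0L);
    }
}
```

With 5-minute windows and a 10-minute retention, an event that is 1 minute late still updates its original window, while an event that is 2 hours late is dropped.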
84Confidential
Key features in 0.10
• Native, 100%-compatible Kafka integration
• Secure stream processing using Kafka's security features
• Elastic and highly scalable
• Fault-tolerant
• Stateful and stateless computations
• Interactive queries
• Time model
• Windowing
• Supports late-arriving and out-of-order data
• Millisecond processing latency, no micro-batching
• At-least-once processing guarantees (exactly-once is in the works as we speak)
85Confidential
Roadmap Outlook
86Confidential
Roadmap outlook for Kafka Streams
• Exactly-once processing semantics
• Unified API for real-time processing and "batch" processing
• Global KTables
• Session windows
• … and more …
87Confidential
Wrapping Up
88Confidential
Where to go from here
• Kafka Streams is available in Confluent Platform 3.1 and in Apache Kafka 0.10.1
• http://www.confluent.io/download
• Kafka Streams demos: https://github.com/confluentinc/examples
• Java 7, Java 8+ with lambdas, and Scala
• WordCount, Interactive Queries, Joins, Security, Windowing, Avro integration, …
• Confluent documentation: http://docs.confluent.io/current/streams/
• Quickstart, Concepts, Architecture, Developer Guide, FAQ
• Recorded talks
• Introduction to Kafka Streams: http://www.youtube.com/watch?v=o7zSLNiTZbA
• Application Development and Data in the Emerging World of Stream Processing (higher level talk): https://www.youtube.com/watch?v=JQnNHO5506w
89Confidential
Thank You
90Confidential
Appendix: Streams and Tables
A closer look
91Confidential
Motivating example: continuously compute current users per geo-region
4
7
5
3
2
8
Real-time dashboard
"How many users younger than 30y, per region?"
alice Asia, 25y, …
bob Europe, 46y, …
… …
user-locations
(mobile team)
user-prefs
(web team)
92Confidential
Motivating example: continuously compute current users per geo-region
4
7
5
3
2
8
Real-time dashboard
"How many users younger than 30y, per region?"
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
user-locations
(mobile team)
user-prefs
(web team)
93Confidential
Motivating example: continuously compute current users per geo-region
4
7
5
3
2
8
Real-time dashboard
"How many users younger than 30y, per region?"
alice Europe
user-locations
user-locations
(mobile team)
user-prefs
(web team)
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
94Confidential
Motivating example: continuously compute current users per geo-region
4
7
5
3
2
8 4
7
6
3
2
7
Alice
Real-time dashboard
"How many users younger than 30y, per region?"
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
-1
+1
user-locations
(mobile team)
user-prefs
(web team)
95Confidential
Same data, but different use cases require different interpretations
alice San Francisco
alice New York City
alice Rio de Janeiro
alice Sydney
alice Beijing
alice Paris
alice Berlin
96Confidential
Same data, but different use cases require different interpretations
alice San Francisco
alice New York City
alice Rio de Janeiro
alice Sydney
alice Beijing
alice Paris
alice Berlin
Use case 1: Frequent traveler status?
Use case 2: Current location?
97Confidential
Same data, but different use cases require different interpretations
"Alice has been to SFO, NYC, Rio, Sydney, Beijing, Paris, and finally Berlin."
"Alice is in SFO, NYC, Rio, Sydney, Beijing, Paris, Berlin right now."
Use case 1: Frequent traveler status? Use case 2: Current location?
98Confidential
Same data, but different use cases require different interpretations
alice San Francisco
alice New York City
alice Rio de Janeiro
alice Sydney
alice Beijing
alice Paris
alice Berlin
Use case 1: Frequent traveler status?
Use case 2: Current location?
99Confidential
Same data, but different use cases require different interpretations
alice San Francisco
alice New York City
alice Rio de Janeiro
alice Sydney
alice Beijing
alice Paris
alice Berlin
Use case 1: Frequent traveler status?
Use case 2: Current location?
100Confidential
Same data, but different use cases require different interpretations
alice San Francisco
alice New York City
alice Rio de Janeiro
alice Sydney
alice Beijing
alice Paris
alice Berlin
Use case 1: Frequent traveler status?
Use case 2: Current location?
101Confidential
Streams meet Tables
When you need all the values of a key, you'd read the Kafka topic into a KStream, with messages interpreted as INSERT (append), so that the topic is interpreted as a record stream. Example: all the places Alice has ever been to.
102Confidential
Streams meet Tables
When you need all the values of a key, you'd read the Kafka topic into a KStream, with messages interpreted as INSERT (append), so that the topic is interpreted as a record stream. Example: all the places Alice has ever been to.
When you need the latest value of a key, you'd read the Kafka topic into a KTable, with messages interpreted as UPSERT (overwrite existing), so that the topic is interpreted as a changelog stream. Example: where Alice is right now.
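The two readings of the same topic can be sketched in plain Java (illustration only; `asRecordStream` and `asChangelog` are hypothetical helpers that mimic the KStream and KTable interpretations, not Streams API calls):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class StreamTableDuality {
    // KStream reading: every message is an INSERT (append) -> record stream
    public static List<String> asRecordStream(List<String[]> messages) {
        List<String> allValues = new ArrayList<>();
        for (String[] kv : messages) allValues.add(kv[1]);
        return allValues;
    }

    // KTable reading: every message is an UPSERT on its key -> changelog stream
    public static Map<String, String> asChangelog(List<String[]> messages) {
        Map<String, String> latestByKey = new LinkedHashMap<>();
        for (String[] kv : messages) latestByKey.put(kv[0], kv[1]);
        return latestByKey;
    }
}
```

Fed Alice's location messages, the stream reading keeps every place she has been to (use case 1), while the table reading collapses them to her current location (use case 2).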
103Confidential
Same data, but different use cases require different interpretations
"Alice has been to SFO, NYC, Rio, Sydney, Beijing, Paris, and finally Berlin."
"Alice is in SFO, NYC, Rio, Sydney, Beijing, Paris, Berlin right now."
Use case 1: Frequent traveler status? Use case 2: Current location?
KStream KTable
104Confidential
Motivating example: continuously compute current users per geo-region
4
7
5
3
2
8 4
7
6
3
2
7
Alice
Real-time dashboard
"How many users younger than 30y, per region?"
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
-1
+1
user-locations
(mobile team)
user-prefs
(web team)
105Confidential
Motivating example: continuously compute current users per geo-region
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
106Confidential
Motivating example: continuously compute current users per geo-region
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
// Merge into detailed user profiles (continuously updated)
KTable<UserId, UserProfile> userProfiles =
userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs));
KTable userProfiles
107Confidential
Motivating example: continuously compute current users per geo-region
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
// Merge into detailed user profiles (continuously updated)
KTable<UserId, UserProfile> userProfiles =
userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs));
// Compute per-region statistics (continuously updated)
KTable<Location, Long> usersPerRegion = userProfiles
.filter((userId, profile) -> profile.age < 30)
.groupBy((userId, profile) -> profile.location)
.count();
alice Europe
user-locations
Africa 3
… …
Asia 8
Europe 5
Africa 3
… …
Asia 7
Europe 6
KTable usersPerRegion
108Confidential
Motivating example: continuously compute current users per geo-region
4
7
5
3
2
8 4
7
6
3
2
7
Alice
Real-time dashboard
"How many users younger than 30y, per region?"
alice Europe
user-locations
alice Asia, 25y, …
bob Europe, 46y, …
… …
alice Europe, 25y, …
bob Europe, 46y, …
… …
-1
+1
user-locations
(mobile team)
user-prefs
(web team)
109Confidential
Another common use case: continuous transformations
• Example: to enrich an input stream (user clicks) with side data (current user profile)
KStream alice /rental/p8454vb, 06:59 PM PDT
user-clicks-topics (at 1M msgs/s)
"facts"
110Confidential
Another common use case: continuous transformations
• Example: to enrich an input stream (user clicks) with side data (current user profile)
KStream alice /rental/p8454vb, 06:59 PM PDT
alice Asia, 25y
bob Europe, 46y
… …
KTable
user-profiles-topic
user-clicks-topics (at 1M msgs/s)
"facts"
"dimensions"
111Confidential
Another common use case: continuous transformations
• Example: to enrich an input stream (user clicks) with side data (current user profile)
KStream
alice /rental/p8454vb, 06:59 PDT, Asia, 25y
stream.JOIN(table)
alice /rental/p8454vb, 06:59 PM PDT
alice Asia, 25y
bob Europe, 46y
… …
KTable
user-profiles-topic
user-clicks-topics (at 1M msgs/s)
"facts"
"dimensions"
112Confidential
Another common use case: continuous transformations
• Example: to enrich an input stream (user clicks) with side data (current user profile)
KStream
alice /rental/p8454vb, 06:59 PDT, Asia, 25y
stream.JOIN(table)
alice /rental/p8454vb, 06:59 PM PDT
alice Asia, 25y
bob Europe, 46y
… …
KTable
alice Europe, 25y
bob Europe, 46y
… …
alice Europe
new update for alice from user-locations topic
user-profiles-topic
user-clicks-topics (at 1M msgs/s)
"facts"
"dimensions"
113Confidential
Appendix: Interactive Queries
A closer look
114Confidential
Interactive Queries
115Confidential
Interactive Queries
charlie 3 | bob 5 | alice 2
116Confidential
Interactive Queries
New API to access local state stores of an app instance
charlie 3 | bob 5 | alice 2
117Confidential
Interactive Queries
New API to discover running app instances
charlie 3 | bob 5 | alice 2
"host1:4460" "host5:5307" "host3:4777"
118Confidential
Interactive Queries
You: inter-app communication (RPC layer)
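One common choice for that RPC layer is a small embedded HTTP endpoint per app instance. A self-contained sketch using the JDK's built-in HTTP server (a plain map stands in for the local state store here; in a real app you would answer queries from the store obtained via the interactive-queries API, and `StateEndpoint` is a hypothetical name):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class StateEndpoint {
    // Stand-in for this instance's local state store (e.g. a count per user).
    static final Map<String, Long> LOCAL_STORE = new ConcurrentHashMap<>();

    // Expose the store at GET /state/<key>, returning the value or 404.
    public static HttpServer start(int port) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/state/", exchange -> {
            String key = exchange.getRequestURI().getPath().substring("/state/".length());
            Long value = LOCAL_STORE.get(key);
            byte[] body = String.valueOf(value).getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(value == null ? 404 : 200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

Other app instances (discovered via the instance-discovery API above) can then be queried over plain HTTP, with no external database in the path.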
Monitoring Apache Kafka with Confluent Control Center   Monitoring Apache Kafka with Confluent Control Center
Monitoring Apache Kafka with Confluent Control Center
Ā 
Strata+Hadoop 2017 San Jose: Lessons from a year of supporting Apache Kafka
Strata+Hadoop 2017 San Jose: Lessons from a year of supporting Apache KafkaStrata+Hadoop 2017 San Jose: Lessons from a year of supporting Apache Kafka
Strata+Hadoop 2017 San Jose: Lessons from a year of supporting Apache Kafka
Ā 
Strata+Hadoop 2017 San Jose - The Rise of Real Time: Apache Kafka and the Str...
Strata+Hadoop 2017 San Jose - The Rise of Real Time: Apache Kafka and the Str...Strata+Hadoop 2017 San Jose - The Rise of Real Time: Apache Kafka and the Str...
Strata+Hadoop 2017 San Jose - The Rise of Real Time: Apache Kafka and the Str...
Ā 
Introduction To Streaming Data and Stream Processing with Apache Kafka
Introduction To Streaming Data and Stream Processing with Apache KafkaIntroduction To Streaming Data and Stream Processing with Apache Kafka
Introduction To Streaming Data and Stream Processing with Apache Kafka
Ā 
Distributed stream processing with Apache Kafka
Distributed stream processing with Apache KafkaDistributed stream processing with Apache Kafka
Distributed stream processing with Apache Kafka
Ā 
Data Pipelines Made Simple with Apache Kafka
Data Pipelines Made Simple with Apache KafkaData Pipelines Made Simple with Apache Kafka
Data Pipelines Made Simple with Apache Kafka
Ā 
What is the Oracle PaaS Cloud for Developers (Oracle Cloud Day, The Netherlan...
What is the Oracle PaaS Cloud for Developers (Oracle Cloud Day, The Netherlan...What is the Oracle PaaS Cloud for Developers (Oracle Cloud Day, The Netherlan...
What is the Oracle PaaS Cloud for Developers (Oracle Cloud Day, The Netherlan...
Ā 
Oracle OpenWorld 2016 Review - Focus on Data, BigData, Streaming Data, Machin...
Oracle OpenWorld 2016 Review - Focus on Data, BigData, Streaming Data, Machin...Oracle OpenWorld 2016 Review - Focus on Data, BigData, Streaming Data, Machin...
Oracle OpenWorld 2016 Review - Focus on Data, BigData, Streaming Data, Machin...
Ā 

Similar to Introducing Kafka's Streams API

What's New in AWS Serverless and Containers
What's New in AWS Serverless and ContainersWhat's New in AWS Serverless and Containers
What's New in AWS Serverless and Containers
Amazon Web Services
Ā 

Similar to Introducing Kafka's Streams API (20)

Kafka Streams: The Stream Processing Engine of Apache Kafka
Kafka Streams: The Stream Processing Engine of Apache KafkaKafka Streams: The Stream Processing Engine of Apache Kafka
Kafka Streams: The Stream Processing Engine of Apache Kafka
Ā 
Kafka Streams for Java enthusiasts
Kafka Streams for Java enthusiastsKafka Streams for Java enthusiasts
Kafka Streams for Java enthusiasts
Ā 
Rethinking Stream Processing with Apache Kafka, Kafka Streams and KSQL
Rethinking Stream Processing with Apache Kafka, Kafka Streams and KSQLRethinking Stream Processing with Apache Kafka, Kafka Streams and KSQL
Rethinking Stream Processing with Apache Kafka, Kafka Streams and KSQL
Ā 
Data Pipelines with Kafka Connect
Data Pipelines with Kafka ConnectData Pipelines with Kafka Connect
Data Pipelines with Kafka Connect
Ā 
Building streaming data applications using Kafka*[Connect + Core + Streams] b...
Building streaming data applications using Kafka*[Connect + Core + Streams] b...Building streaming data applications using Kafka*[Connect + Core + Streams] b...
Building streaming data applications using Kafka*[Connect + Core + Streams] b...
Ā 
Building Streaming Data Applications Using Apache Kafka
Building Streaming Data Applications Using Apache KafkaBuilding Streaming Data Applications Using Apache Kafka
Building Streaming Data Applications Using Apache Kafka
Ā 
Apache Kafka vs. Traditional Middleware (Kai Waehner, Confluent) Frankfurt 20...
Apache Kafka vs. Traditional Middleware (Kai Waehner, Confluent) Frankfurt 20...Apache Kafka vs. Traditional Middleware (Kai Waehner, Confluent) Frankfurt 20...
Apache Kafka vs. Traditional Middleware (Kai Waehner, Confluent) Frankfurt 20...
Ā 
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or ...
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or ...Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or ...
Apache Kafka vs. Integration Middleware (MQ, ETL, ESB) - Friends, Enemies or ...
Ā 
BBL KAPPA Lesfurets.com
BBL KAPPA Lesfurets.comBBL KAPPA Lesfurets.com
BBL KAPPA Lesfurets.com
Ā 
What's New in AWS Serverless and Containers
What's New in AWS Serverless and ContainersWhat's New in AWS Serverless and Containers
What's New in AWS Serverless and Containers
Ā 
Being Ready for Apache Kafka - Apache: Big Data Europe 2015
Being Ready for Apache Kafka - Apache: Big Data Europe 2015Being Ready for Apache Kafka - Apache: Big Data Europe 2015
Being Ready for Apache Kafka - Apache: Big Data Europe 2015
Ā 
Architecture patterns for distributed, hybrid, edge and global Apache Kafka d...
Architecture patterns for distributed, hybrid, edge and global Apache Kafka d...Architecture patterns for distributed, hybrid, edge and global Apache Kafka d...
Architecture patterns for distributed, hybrid, edge and global Apache Kafka d...
Ā 
Introduction to apache kafka, confluent and why they matter
Introduction to apache kafka, confluent and why they matterIntroduction to apache kafka, confluent and why they matter
Introduction to apache kafka, confluent and why they matter
Ā 
Introduction to Apache Kafka and why it matters - Madrid
Introduction to Apache Kafka and why it matters - MadridIntroduction to Apache Kafka and why it matters - Madrid
Introduction to Apache Kafka and why it matters - Madrid
Ā 
Beyond the Brokers: A Tour of the Kafka Ecosystem
Beyond the Brokers: A Tour of the Kafka EcosystemBeyond the Brokers: A Tour of the Kafka Ecosystem
Beyond the Brokers: A Tour of the Kafka Ecosystem
Ā 
Beyond the brokers - A tour of the Kafka ecosystem
Beyond the brokers - A tour of the Kafka ecosystemBeyond the brokers - A tour of the Kafka ecosystem
Beyond the brokers - A tour of the Kafka ecosystem
Ā 
App modernization on AWS with Apache Kafka and Confluent Cloud
App modernization on AWS with Apache Kafka and Confluent CloudApp modernization on AWS with Apache Kafka and Confluent Cloud
App modernization on AWS with Apache Kafka and Confluent Cloud
Ā 
Data Summer Conf 2018, ā€œBuilding unified Batch and Stream processing pipeline...
Data Summer Conf 2018, ā€œBuilding unified Batch and Stream processing pipeline...Data Summer Conf 2018, ā€œBuilding unified Batch and Stream processing pipeline...
Data Summer Conf 2018, ā€œBuilding unified Batch and Stream processing pipeline...
Ā 
Kafka Explainaton
Kafka ExplainatonKafka Explainaton
Kafka Explainaton
Ā 
Confluent kafka meetupseattle jan2017
Confluent kafka meetupseattle jan2017Confluent kafka meetupseattle jan2017
Confluent kafka meetupseattle jan2017
Ā 

More from confluent

More from confluent (20)

Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...
Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...
Catch the Wave: SAP Event-Driven and Data Streaming for the Intelligence Ente...
Ā 
Santander Stream Processing with Apache Flink
Santander Stream Processing with Apache FlinkSantander Stream Processing with Apache Flink
Santander Stream Processing with Apache Flink
Ā 
Unlocking the Power of IoT: A comprehensive approach to real-time insights
Unlocking the Power of IoT: A comprehensive approach to real-time insightsUnlocking the Power of IoT: A comprehensive approach to real-time insights
Unlocking the Power of IoT: A comprehensive approach to real-time insights
Ā 
Workshop hĆ­brido: Stream Processing con Flink
Workshop hĆ­brido: Stream Processing con FlinkWorkshop hĆ­brido: Stream Processing con Flink
Workshop hĆ­brido: Stream Processing con Flink
Ā 
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...
Industry 4.0: Building the Unified Namespace with Confluent, HiveMQ and Spark...
Ā 
AWS Immersion Day Mapfre - Confluent
AWS Immersion Day Mapfre   -   ConfluentAWS Immersion Day Mapfre   -   Confluent
AWS Immersion Day Mapfre - Confluent
Ā 
Eventos y Microservicios - Santander TechTalk
Eventos y Microservicios - Santander TechTalkEventos y Microservicios - Santander TechTalk
Eventos y Microservicios - Santander TechTalk
Ā 
Q&A with Confluent Experts: Navigating Networking in Confluent Cloud
Q&A with Confluent Experts: Navigating Networking in Confluent CloudQ&A with Confluent Experts: Navigating Networking in Confluent Cloud
Q&A with Confluent Experts: Navigating Networking in Confluent Cloud
Ā 
Citi TechTalk Session 2: Kafka Deep Dive
Citi TechTalk Session 2: Kafka Deep DiveCiti TechTalk Session 2: Kafka Deep Dive
Citi TechTalk Session 2: Kafka Deep Dive
Ā 
Build real-time streaming data pipelines to AWS with Confluent
Build real-time streaming data pipelines to AWS with ConfluentBuild real-time streaming data pipelines to AWS with Confluent
Build real-time streaming data pipelines to AWS with Confluent
Ā 
Q&A with Confluent Professional Services: Confluent Service Mesh
Q&A with Confluent Professional Services: Confluent Service MeshQ&A with Confluent Professional Services: Confluent Service Mesh
Q&A with Confluent Professional Services: Confluent Service Mesh
Ā 
Citi Tech Talk: Event Driven Kafka Microservices
Citi Tech Talk: Event Driven Kafka MicroservicesCiti Tech Talk: Event Driven Kafka Microservices
Citi Tech Talk: Event Driven Kafka Microservices
Ā 
Confluent & GSI Webinars series - Session 3
Confluent & GSI Webinars series - Session 3Confluent & GSI Webinars series - Session 3
Confluent & GSI Webinars series - Session 3
Ā 
Citi Tech Talk: Messaging Modernization
Citi Tech Talk: Messaging ModernizationCiti Tech Talk: Messaging Modernization
Citi Tech Talk: Messaging Modernization
Ā 
Citi Tech Talk: Data Governance for streaming and real time data
Citi Tech Talk: Data Governance for streaming and real time dataCiti Tech Talk: Data Governance for streaming and real time data
Citi Tech Talk: Data Governance for streaming and real time data
Ā 
Confluent & GSI Webinars series: Session 2
Confluent & GSI Webinars series: Session 2Confluent & GSI Webinars series: Session 2
Confluent & GSI Webinars series: Session 2
Ā 
Data In Motion Paris 2023
Data In Motion Paris 2023Data In Motion Paris 2023
Data In Motion Paris 2023
Ā 
Confluent Partner Tech Talk with Synthesis
Confluent Partner Tech Talk with SynthesisConfluent Partner Tech Talk with Synthesis
Confluent Partner Tech Talk with Synthesis
Ā 
The Future of Application Development - API Days - Melbourne 2023
The Future of Application Development - API Days - Melbourne 2023The Future of Application Development - API Days - Melbourne 2023
The Future of Application Development - API Days - Melbourne 2023
Ā 
The Playful Bond Between REST And Data Streams
The Playful Bond Between REST And Data StreamsThe Playful Bond Between REST And Data Streams
The Playful Bond Between REST And Data Streams
Ā 

Recently uploaded

+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
Health
Ā 
TECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service providerTECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service provider
mohitmore19
Ā 
CHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICE
CHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICECHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICE
CHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICE
9953056974 Low Rate Call Girls In Saket, Delhi NCR
Ā 
CALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female service
CALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female serviceCALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female service
CALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female service
anilsa9823
Ā 
CALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online ā˜‚ļø
CALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online  ā˜‚ļøCALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online  ā˜‚ļø
CALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online ā˜‚ļø
anilsa9823
Ā 
Hand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptxHand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptx
bodapatigopi8531
Ā 

Recently uploaded (20)

Right Money Management App For Your Financial Goals
Right Money Management App For Your Financial GoalsRight Money Management App For Your Financial Goals
Right Money Management App For Your Financial Goals
Ā 
SyndBuddy AI 2k Review 2024: Revolutionizing Content Syndication with AI
SyndBuddy AI 2k Review 2024: Revolutionizing Content Syndication with AISyndBuddy AI 2k Review 2024: Revolutionizing Content Syndication with AI
SyndBuddy AI 2k Review 2024: Revolutionizing Content Syndication with AI
Ā 
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Try MyIntelliAccount Cloud Accounting Software As A Service Solution Risk Fre...
Ā 
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
Ā 
TECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service providerTECUNIQUE: Success Stories: IT Service provider
TECUNIQUE: Success Stories: IT Service provider
Ā 
HR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.comHR Software Buyers Guide in 2024 - HRSoftware.com
HR Software Buyers Guide in 2024 - HRSoftware.com
Ā 
Software Quality Assurance Interview Questions
Software Quality Assurance Interview QuestionsSoftware Quality Assurance Interview Questions
Software Quality Assurance Interview Questions
Ā 
CHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICE
CHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICECHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICE
CHEAP Call Girls in Pushp Vihar (-DELHI )šŸ” 9953056974šŸ”(=)/CALL GIRLS SERVICE
Ā 
Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlanā€™s ...
Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlanā€™s ...Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlanā€™s ...
Tech Tuesday-Harness the Power of Effective Resource Planning with OnePlanā€™s ...
Ā 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Ā 
CALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female service
CALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female serviceCALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female service
CALL ON āž„8923113531 šŸ”Call Girls Badshah Nagar Lucknow best Female service
Ā 
Shapes for Sharing between Graph Data SpacesĀ - and Epistemic Querying of RDF-...
Shapes for Sharing between Graph Data SpacesĀ - and Epistemic Querying of RDF-...Shapes for Sharing between Graph Data SpacesĀ - and Epistemic Querying of RDF-...
Shapes for Sharing between Graph Data SpacesĀ - and Epistemic Querying of RDF-...
Ā 
A Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docxA Secure and Reliable Document Management System is Essential.docx
A Secure and Reliable Document Management System is Essential.docx
Ā 
CALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online ā˜‚ļø
CALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online  ā˜‚ļøCALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online  ā˜‚ļø
CALL ON āž„8923113531 šŸ”Call Girls Kakori Lucknow best sexual service Online ā˜‚ļø
Ā 
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected WorkerHow To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
Ā 
call girls in Vaishali (Ghaziabad) šŸ” >ą¼’8448380779 šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļø
call girls in Vaishali (Ghaziabad) šŸ” >ą¼’8448380779 šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļøcall girls in Vaishali (Ghaziabad) šŸ” >ą¼’8448380779 šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļø
call girls in Vaishali (Ghaziabad) šŸ” >ą¼’8448380779 šŸ” genuine Escort Service šŸ”āœ”ļøāœ”ļø
Ā 
5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdf5 Signs You Need a Fashion PLM Software.pdf
5 Signs You Need a Fashion PLM Software.pdf
Ā 
Unlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language ModelsUnlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language Models
Ā 
Hand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptxHand gesture recognition PROJECT PPT.pptx
Hand gesture recognition PROJECT PPT.pptx
Ā 
Optimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTVOptimizing AI for immediate response in Smart CCTV
Optimizing AI for immediate response in Smart CCTV
Ā 

Introducing Kafka's Streams API

  • 1. Introducing Kafka's Streams API: stream processing made simple. Target audience: technical staff, developers, architects. Expected duration for full deck: 45 minutes.
  • 2. Apache Kafka: birthed as a messaging system, now a streaming platform. Release timeline: 0.7 (2012) cluster mirroring, data compression; 0.8 (2013) intra-cluster replication; 0.9 (2015) data integration (Connect API); 0.10 (2016) data processing (Streams API).
  • 3. Kafka's Streams API: the easiest way to process data in Apache Kafka. Key benefits: Build Apps, Not Clusters (no additional cluster required); cluster to go (elastic, scalable, distributed, fault-tolerant, secure); database to go (tables, local state, interactive queries); equally viable for S / M / L / XL / XXL use cases; "runs everywhere" (integrates with your existing deployment strategies such as containers, automation, cloud). Part of open source Apache Kafka, introduced in 0.10+: a powerful client library to build stream processing apps; apps are standard Java applications that run on client machines; https://github.com/apache/kafka/tree/trunk/streams
  • 4. Kafka's Streams API: Unix analogy: $ cat < in.txt | grep "apache" | tr a-z A-Z > out.txt. Getting data into and out of the files corresponds to Kafka's Connect API; the processing steps in between correspond to the Streams API.
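To make the analogy concrete, here is a minimal, self-contained sketch (plain Java collections standing in for topics and files; the class name UnixAnalogy and the sample lines are made up for illustration) of the same filter-then-transform pipeline:

```java
import java.util.List;
import java.util.stream.Collectors;

public class UnixAnalogy {
    // Mirrors: cat < in.txt | grep "apache" | tr a-z A-Z > out.txt
    static List<String> process(List<String> inputLines) {
        return inputLines.stream()
                .filter(line -> line.contains("apache"))   // grep "apache"
                .map(String::toUpperCase)                  // tr a-z A-Z
                .collect(Collectors.toList());             // > out.txt
    }

    public static void main(String[] args) {
        List<String> in = List.of("hello apache kafka", "unrelated line", "apache streams");
        System.out.println(process(in)); // [HELLO APACHE KAFKA, APACHE STREAMS]
    }
}
```

In the real Streams API the input and output would be Kafka topics rather than in-memory lists, but the shape of the computation is the same.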
  • 5. Streams API in the context of Kafka: other systems feed data into the Kafka cluster through the Connect API, your app processes that data with the Streams API, and results flow back out through the Connect API to other systems.
  • 6. When to use Kafka's Streams API: mainstream application development; to build core business applications; microservices; fast data apps for small and big data; reactive applications; continuous queries and transformations; event-triggered processes; the "T" in ETL; and more. Use case examples: real-time monitoring and intelligence; customer 360-degree view; fraud detection; location-based marketing; fleet management; and more.
  • 7. Some public use cases in the wild & external articles: Kafka's Streams API for the internal message delivery pipeline at LINE Corp. (http://developers.linecorp.com/blog/?p=3960; Kafka Streams in production at LINE, a social platform based in Japan with 220+ million users); microservices and reactive applications at Capital One (https://speakerdeck.com/bobbycalderwood/commander-decoupled-immutable-rest-apis-with-kafka-streams); user behavior analysis (https://timothyrenner.github.io/engineering/2016/08/11/kafka-streams-not-looking-at-facebook.html); containerized Kafka Streams applications in Scala (https://www.madewithtea.com/processing-tweets-with-kafka-streams.html); geo-spatial data analysis (http://www.infolace.com/blog/2016/07/14/simple-spatial-windowing-with-kafka-streams/); language classification with machine learning (https://dzone.com/articles/machine-learning-with-kafka-streams).
  • 9. Architecture comparison: use case example. A real-time dashboard for security monitoring: "Which of my data centers are under attack?"
  • 10. Architecture comparison: use case example. Before (undue complexity, heavy footprint, many technologies, split ownership with conflicting priorities): (1) capture business events in Kafka; (2) process events with a separate cluster (e.g. Spark); (3) share the latest results through separate systems (e.g. MySQL); (4) other apps access the latest results by querying those databases. With Kafka Streams (simplified, app-centric architecture that puts app owners in control): (1) capture business events in Kafka; (2) process events with standard Java apps that use Kafka Streams; (3) other apps can now directly query the latest results.
  • 13. How do I install the Streams API? There is, and there should be, no "installation": Build Apps, Not Clusters! It's a library; add it to your app like any other library:
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-streams</artifactId>
  <version>0.10.1.1</version>
</dependency>
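If you build with Gradle instead of Maven, the equivalent dependency declaration would be roughly (using the `compile` configuration that was standard at the time; coordinates are the same as in the Maven snippet above):

```gradle
dependencies {
    compile 'org.apache.kafka:kafka-streams:0.10.1.1'
}
```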
  • 14. "But wait a minute: where's THE CLUSTER to process the data?" No cluster needed: Build Apps, Not Clusters! Unlearn bad habits: "do cool stuff with data" ≠ "must have a cluster".
  • 15. Organizational benefits: decouple teams and roadmaps, scale people.
  • 16. Organizational benefits: decouple teams and roadmaps, scale people. Example: an infrastructure team runs Kafka as a shared, multi-tenant service, while application teams own their own Streams apps: the payments team its fraud detection app, the mobile team its recommendations app, the operations team its security alerts app, and more.
  • 17. How do I package, deploy, monitor my apps? How do I …? Whatever works for you; stick to what you and your company think is the best way. No magic needed. Why? Because an app that uses the Streams API is a normal Java app.
  • 19. The API is but the tip of the iceberg. Above the waterline: API, coding. Below, the Reality™: architecture, organizational processes, deployment, operations, security, debugging, …
  • 20. API option 1: DSL (declarative). The preferred API for most use cases; it particularly appeals to fans of Scala and functional programming, and to users familiar with e.g. Spark.
KStream<Integer, Integer> input = builder.stream("numbers-topic");

// Stateless computation
KStream<Integer, Integer> doubled = input.mapValues(v -> v * 2);

// Stateful computation
KTable<Integer, Integer> sumOfOdds = input
  .filter((k, v) -> v % 2 != 0)
  .selectKey((k, v) -> 1)
  .groupByKey()
  .reduce((v1, v2) -> v1 + v2, "sum-of-odds");
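To make the semantics of those two computations concrete, here is a self-contained sketch in plain Java (no Kafka; the class name DslSemantics and the sample numbers are made up for illustration) that produces the same results over an in-memory list that the topology produces over a topic:

```java
import java.util.List;
import java.util.stream.Collectors;

public class DslSemantics {
    // What the stateless step computes: every value doubled.
    static List<Integer> doubled(List<Integer> input) {
        return input.stream().map(v -> v * 2).collect(Collectors.toList());
    }

    // What the stateful step computes: the running sum of all odd values.
    static int sumOfOdds(List<Integer> input) {
        return input.stream().filter(v -> v % 2 != 0).mapToInt(Integer::intValue).sum();
    }

    public static void main(String[] args) {
        List<Integer> numbers = List.of(1, 2, 3, 4, 5);
        System.out.println(doubled(numbers));   // [2, 4, 6, 8, 10]
        System.out.println(sumOfOdds(numbers)); // 9
    }
}
```

The difference in Kafka Streams is that the inputs are unbounded: sumOfOdds is not computed once but maintained continuously as new records arrive.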
  • 21. API option 2: Processor API (imperative). Full flexibility but more manual work. Appeals to users who require functionality that is not yet available in the DSL and to users familiar with e.g. Storm or Samza. Still, check out the DSL!
class PrintToConsoleProcessor implements Processor<K, V> {
  @Override
  public void init(ProcessorContext context) {}
  @Override
  public void process(K key, V value) {
    System.out.println("Got value " + value);
  }
  @Override
  public void punctuate(long timestamp) {}
  @Override
  public void close() {}
}
  • 22. When to use Kafka Streams vs. Kafka's "normal" consumer clients. Kafka Streams: basically all the time. Kafka consumer clients (Java, C/C++, Python, Go, …): when you must interact with Kafka at a very low level and/or in a very special way, for example when integrating your own stream processing tool (Spark, Storm) with Kafka.
  • 23. Code comparison, featuring Kafka's Streams API vs. Spark Streaming.
  • 24. "My WordCount is better than your WordCount" (?) These isolated code snippets are nice (and actually quite similar), but they are not very meaningful. In practice we also need to read data from somewhere, write data back to somewhere, etc., but we can see none of this here.
  • 26. Compared to: WordCount in Spark 2.0. The runtime model leaks into the processing logic (here: interfacing from Spark with Kafka).
  • 27. Compared to: WordCount in Spark 2.0. The runtime model leaks into the processing logic (driver vs. executors).
  • 32. Streams and Tables: stream processing meets databases.
  • 35. Key observation: there is a close relationship between streams and tables. See http://www.confluent.io/blog/introducing-kafka-streams-stream-processing-made-simple and http://docs.confluent.io/current/streams/concepts.html#duality-of-streams-and-tables
  • 37. Example: streams and tables in Kafka, word count. The table of current counts, e.g. hello → 2, kafka → 1, world → 1, is continuously updated as new words stream in.
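The stream-table duality behind this slide can be sketched in a few lines of plain Java (no Kafka; the class name StreamTableDuality and the word list are made up for illustration): replaying the stream of words yields the table of current counts, and every update to the table is itself a new record in a changelog stream.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class StreamTableDuality {
    // Replaying the stream of words materializes the table of current counts.
    static Map<String, Long> materialize(List<String> words) {
        Map<String, Long> table = new LinkedHashMap<>();
        for (String word : words) {
            table.merge(word, 1L, Long::sum); // each merge is one changelog update
        }
        return table;
    }

    public static void main(String[] args) {
        Map<String, Long> counts = materialize(List.of("hello", "kafka", "hello", "world"));
        System.out.println(counts); // {hello=2, kafka=1, world=1}
    }
}
```

Kafka Streams applies exactly this idea at scale: a KTable is the materialization of a stream of updates, and it can always be turned back into a stream.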
  • 42. Example: continuously compute current users per geo-region. Two source topics are maintained as tables: user-locations (owned by the mobile team) and user-prefs (owned by the web team). When Alice moves from Asia to Europe, the per-region counts update accordingly (Asia -1, Europe +1), and a real-time dashboard answers "How many users younger than 30y, per region?"
  • 43. Example: continuously compute current users per geo-region.
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
  • 44. Example: continuously compute current users per geo-region.
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");

// Merge into detailed user profiles (continuously updated)
KTable<UserId, UserProfile> userProfiles =
  userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs));
  • 45. Example: continuously compute current users per geo-region.
KTable<UserId, Location> userLocations = builder.table("user-locations-topic");
KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");

// Merge into detailed user profiles (continuously updated)
KTable<UserId, UserProfile> userProfiles =
  userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs));

// Compute per-region statistics (continuously updated)
KTable<Location, Long> usersPerRegion = userProfiles
  .filter((userId, profile) -> profile.age < 30)
  .groupBy((userId, profile) -> profile.location)
  .count();
As updates arrive on user-locations, the usersPerRegion table changes continuously, e.g. Europe 5 → 6 and Asia 8 → 7 when Alice moves.
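What this topology computes at any point in time can be checked with a self-contained sketch over plain maps (no Kafka; the class name UsersPerRegion, the user names, and the ages are made up for illustration). Two tables keyed by user id are inner-joined, filtered by age, and grouped by region:

```java
import java.util.HashMap;
import java.util.Map;

public class UsersPerRegion {
    // What the KTable join + filter + groupBy + count computes:
    // the number of users younger than 30, per region.
    static Map<String, Long> usersPerRegion(Map<String, String> locations, Map<String, Integer> ages) {
        Map<String, Long> counts = new HashMap<>();
        for (Map.Entry<String, String> e : locations.entrySet()) {
            Integer age = ages.get(e.getKey());    // inner join on user id
            if (age != null && age < 30) {         // filter: younger than 30
                counts.merge(e.getValue(), 1L, Long::sum); // groupBy region, count
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        Map<String, String> locations = Map.of("alice", "Europe", "bob", "Europe", "carol", "Asia");
        Map<String, Integer> ages = Map.of("alice", 25, "bob", 46, "carol", 29);
        System.out.println(usersPerRegion(locations, ages)); // Europe -> 1, Asia -> 1
    }
}
```

Unlike this batch sketch, the Streams topology maintains the result incrementally: one update to user-locations changes only the affected region counts.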
  • 46. Example: continuously compute current users per geo-region (recap). The continuously updated per-region counts power the real-time dashboard: "How many users younger than 30y, per region?"
  • 48. Streams meet tables. Most use cases for stream processing require both streams and tables; they are essential for any stateful computations. Kafka ships with first-class support for streams and tables: scalability, fault tolerance, efficient joins and aggregations, … Benefits include simplified architectures, fewer moving pieces, and less do-it-yourself work.
  • 50. Key features in 0.10: native, 100%-compatible Kafka integration.
  • 51. 51Confidential Native, 100% compatible Kafka integration Read Ā from Ā Kafka Write Ā to Ā Kafka
  • 52. 52Confidential Key features in 0.10 ā€¢ Native, 100%-compatible Kafka integration ā€¢ Secure stream processing using Kafkaā€™s security features
  • 53. 53Confidential Secure stream processing with the Streams API • Your applications can leverage all client-side security features in Apache Kafka • Security features include: • Encrypting data-in-transit between applications and Kafka clusters • Authenticating applications against Kafka clusters ("only some apps may talk to the production cluster") • Authorizing applications against Kafka clusters ("only some apps may read data from sensitive topics")
  • 54. 54Confidential Configuring security settings • In general, you can configure both Kafka Streams and the underlying Kafka clients in your apps
  • 55. 55Confidential Configuring security settings • Example: encrypting data-in-transit + client authentication to Kafka cluster Full demo application at https://github.com/confluentinc/examples
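The configuration code on this slide is only visible in the original deck; it might look roughly like the following sketch. File paths and passwords are placeholders; the property keys are the standard Kafka client SSL settings, which Kafka Streams passes through to its internal clients.

```java
import java.util.Properties;

// Sketch (placeholder paths/passwords): encrypting data-in-transit via SSL,
// plus authenticating this app to the cluster via its SSL keystore.
Properties props = new Properties();
props.put("application.id", "secure-streams-app");
props.put("bootstrap.servers", "kafka-broker1:9093");
// Encrypt data-in-transit between the application and the Kafka cluster
props.put("security.protocol", "SSL");
props.put("ssl.truststore.location", "/etc/security/kafka.client.truststore.jks");
props.put("ssl.truststore.password", "test1234");
// Authenticate the application against the Kafka cluster
props.put("ssl.keystore.location", "/etc/security/kafka.client.keystore.jks");
props.put("ssl.keystore.password", "test1234");
props.put("ssl.key.password", "test1234");
```

See the demo application linked on the slide for a complete, working configuration.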
  • 56. 56Confidential Key features in 0.10 • Native, 100%-compatible Kafka integration • Secure stream processing using Kafka's security features • Elastic and highly scalable • Fault-tolerant
  • 60. 60Confidential Key features in 0.10 • Native, 100%-compatible Kafka integration • Secure stream processing using Kafka's security features • Elastic and highly scalable • Fault-tolerant • Stateful and stateless computations
  • 61. 61Confidential Stateful computations • Stateful computations like aggregations (e.g. counting), joins, or windowing require state • State stores are the backbone of state management • … are local for best performance • … are backed up to Kafka for elasticity and for fault-tolerance • … are per stream task for isolation – think: share-nothing • Pluggable storage engines • Default: RocksDB (a key-value store) to allow for local state that is larger than available RAM • You can also use your own, custom storage engine • From the user perspective: • DSL: no need to worry about anything, state management is automatically done for you • Processor API: direct access to state stores – very flexible but more manual work
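The "local store backed up to Kafka" idea can be illustrated with a plain-Java model (not the actual Streams API): every write goes to a fast local map and is also appended to a changelog, from which a fresh instance can rebuild identical state after a failure or rebalance.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Plain-Java model (NOT the Streams API) of a changelog-backed state store:
// writes hit the local store (think RocksDB) and the changelog (think a
// compacted Kafka topic); replaying the changelog restores the state.
class ChangelogBackedStore {
    final Map<String, Long> local = new HashMap<>();   // local store
    final List<Map.Entry<String, Long>> changelog;     // durable backup

    ChangelogBackedStore(List<Map.Entry<String, Long>> changelog) {
        this.changelog = changelog;
        // Fault tolerance: rebuild local state by replaying the changelog
        for (Map.Entry<String, Long> e : changelog) local.put(e.getKey(), e.getValue());
    }

    void put(String key, long value) {
        local.put(key, value);                // fast local write
        changelog.add(Map.entry(key, value)); // backed up for recovery
    }

    Long get(String key) { return local.get(key); }
}
```

A second instance constructed from the same changelog sees the same state, which is how Streams moves tasks between machines elastically.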
  • 66. 66Confidential Use case: real-time, distributed joins at large scale
  • 69. 69Confidential Stateful computations • Use the Processor API to interact directly with state stores Get the store Use the store
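The "get the store, use the store" pattern on this slide (the code itself is an image in the original deck) can be sketched as follows. The two `Mini*` interfaces are miniature stand-ins for `Processor` and `ProcessorContext` from `org.apache.kafka.streams.processor`, included here only so the sketch is self-contained; in a real app you would implement the Kafka interfaces and register the store on the topology.

```java
import java.util.HashMap;
import java.util.Map;

// Miniature stand-ins for the real Processor API interfaces (assumption:
// simplified shapes, kept here so the example runs without Kafka).
interface MiniContext {
    Object getStateStore(String name);
}

interface MiniProcessor<K, V> {
    void init(MiniContext context);
    void process(K key, V value);
}

class WordCountProcessor implements MiniProcessor<String, String> {
    private Map<String, Long> store;  // stands in for KeyValueStore<String, Long>

    @SuppressWarnings("unchecked")
    public void init(MiniContext context) {
        // Get the store (looked up by the name it was registered under)
        store = (Map<String, Long>) context.getStateStore("counts-store");
    }

    public void process(String key, String value) {
        // Use the store: read-modify-write the running count for this key
        Long old = store.get(key);
        store.put(key, old == null ? 1L : old + 1L);
    }
}
```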
  • 70. 70Confidential Key features in 0.10 • Native, 100%-compatible Kafka integration • Secure stream processing using Kafka's security features • Elastic and highly scalable • Fault-tolerant • Stateful and stateless computations • Interactive queries
  • 72. 72Confidential Interactive Queries: architecture comparison Kafka Streams App App App App 1 Capture business events in Kafka 2 Process the events with Kafka Streams 4 Other apps query external systems for latest results ! Must use external systems to share latest results App App App 1 Capture business events in Kafka 2 Process the events with Kafka Streams 3 Now other apps can directly query the latest results Before (0.10.0) After (0.10.1): simplified, more app-centric architecture Kafka Streams App
  • 73. 73Confidential Key features in 0.10 • Native, 100%-compatible Kafka integration • Secure stream processing using Kafka's security features • Elastic and highly scalable • Fault-tolerant • Stateful and stateless computations • Interactive queries • Time model
  • 76. 76Confidential Time • You configure the desired time semantics through timestamp extractors • Default extractor yields event-time semantics • Extracts embedded timestamps of Kafka messages (introduced in v0.10)
  • 77. 77Confidential Key features in 0.10 • Native, 100%-compatible Kafka integration • Secure stream processing using Kafka's security features • Elastic and highly scalable • Fault-tolerant • Stateful and stateless computations • Interactive queries • Time model • Windowing
  • 78. 78Confidential Windowing • Group events in a stream using time-based windows • Use case examples: • Time-based analysis of ad impressions ("number of ads clicked in the past hour") • Monitoring statistics of telemetry data ("1min/5min/15min averages") Input data, where colors represent different users events Rectangles denote different event-time windows processing-time event-time windowing alice bob dave
  • 79. 79Confidential Windowing in the DSL TimeWindows.of(3000) TimeWindows.of(3000).advanceBy(1000)
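`TimeWindows.of(3000)` defines 3-second tumbling windows; adding `.advanceBy(1000)` turns them into 3-second hopping windows that start every second, so each event falls into up to three overlapping windows. The assignment rule can be sketched with plain arithmetic (a self-contained model, not the Streams API itself):

```java
import java.util.ArrayList;
import java.util.List;

// Self-contained model of time-based window assignment (not the Streams
// API). size=advance models tumbling windows; advance<size models hopping.
class WindowModel {
    // Start timestamps of all windows [s, s + sizeMs) containing `timestamp`.
    static List<Long> windowStartsFor(long timestamp, long sizeMs, long advanceMs) {
        List<Long> starts = new ArrayList<>();
        long lastStart = (timestamp / advanceMs) * advanceMs;
        for (long s = lastStart; s > timestamp - sizeMs && s >= 0; s -= advanceMs) {
            starts.add(s); // this window still covers the event's timestamp
        }
        return starts;
    }
}
```

For example, an event with timestamp 2500 ms falls into exactly one tumbling window but into the three hopping windows starting at 2000, 1000, and 0 ms.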
  • 80. 80Confidential Key features in 0.10 • Native, 100%-compatible Kafka integration • Secure stream processing using Kafka's security features • Elastic and highly scalable • Fault-tolerant • Stateful and stateless computations • Interactive queries • Time model • Windowing • Supports late-arriving and out-of-order data
  • 81. 81Confidential Out-of-order and late-arriving data • Is very common in practice, not a rare corner case • Related to time model discussion
  • 82. 82Confidential Out-of-order and late-arriving data: example when this will happen Users with mobile phones enter airplane, lose Internet connectivity Emails are being written during the 10h flight Internet connectivity is restored, phones will send queued emails now
  • 83. 83Confidential Out-of-order and late-arriving data • Is very common in practice, not a rare corner case • Related to time model discussion • We want control over how out-of-order data is handled, and handling must be efficient • Example: We process data in 5-minute windows, e.g. compute statistics • Option A: When an event arrives 1 minute late: update the original result! • Option B: When an event arrives 2 hours late: discard it!
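Options A and B above can be modeled in a few lines of plain Java (a self-contained sketch, not the Streams API; the one-hour retention is an assumed example value): windowed counts stay updatable while the window is retained, and events older than the retention period are discarded.

```java
import java.util.HashMap;
import java.util.Map;

// Self-contained model of late-data handling: 5-minute windowed counts,
// retained for 1 hour. A late event updates its window's result (Option A)
// unless the window has aged beyond the retention period (Option B).
class LateDataModel {
    static final long WINDOW_MS = 5 * 60_000;
    static final long RETENTION_MS = 60 * 60_000;

    final Map<Long, Long> countsByWindowStart = new HashMap<>();
    long maxEventTime = 0; // "stream time": highest event timestamp seen

    /** Returns true if the event was counted, false if discarded as too late. */
    boolean process(long eventTime) {
        maxEventTime = Math.max(maxEventTime, eventTime);
        long windowStart = (eventTime / WINDOW_MS) * WINDOW_MS;
        if (maxEventTime - windowStart > RETENTION_MS) {
            return false; // Option B: window no longer retained, discard
        }
        countsByWindowStart.merge(windowStart, 1L, Long::sum); // Option A
        return true;
    }
}
```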
  • 84. 84Confidential Key features in 0.10 • Native, 100%-compatible Kafka integration • Secure stream processing using Kafka's security features • Elastic and highly scalable • Fault-tolerant • Stateful and stateless computations • Interactive queries • Time model • Windowing • Supports late-arriving and out-of-order data • Millisecond processing latency, no micro-batching • At-least-once processing guarantees (exactly-once is in the works as we speak)
  • 86. 86Confidential Roadmap outlook for Kafka Streams • Exactly-Once processing semantics • Unified API for real-time processing and "batch" processing • Global KTables • Session windows • … and more …
  • 88. 88Confidential Where to go from here • Kafka Streams is available in Confluent Platform 3.1 and in Apache Kafka 0.10.1 • http://www.confluent.io/download • Kafka Streams demos: https://github.com/confluentinc/examples • Java 7, Java 8+ with lambdas, and Scala • WordCount, Interactive Queries, Joins, Security, Windowing, Avro integration, … • Confluent documentation: http://docs.confluent.io/current/streams/ • Quickstart, Concepts, Architecture, Developer Guide, FAQ • Recorded talks • Introduction to Kafka Streams: http://www.youtube.com/watch?v=o7zSLNiTZbA • Application Development and Data in the Emerging World of Stream Processing (higher level talk): https://www.youtube.com/watch?v=JQnNHO5506w
  • 90. 90Confidential Appendix: Streams and Tables A closer look
  • 91. 91Confidential Motivating example: continuously compute current users per geo-region 4 7 5 3 2 8 Real-time dashboard "How many users younger than 30y, per region?" alice Asia, 25y, … bob Europe, 46y, … … … user-locations (mobile team) user-prefs (web team)
  • 92. 92Confidential Motivating example: continuously compute current users per geo-region 4 7 5 3 2 8 Real-time dashboard "How many users younger than 30y, per region?" alice Europe user-locations alice Asia, 25y, … bob Europe, 46y, … … … user-locations (mobile team) user-prefs (web team)
  • 93. 93Confidential Motivating example: continuously compute current users per geo-region 4 7 5 3 2 8 Real-time dashboard "How many users younger than 30y, per region?" alice Europe user-locations user-locations (mobile team) user-prefs (web team) alice Asia, 25y, … bob Europe, 46y, … … … alice Europe, 25y, … bob Europe, 46y, … … …
  • 94. 94Confidential Motivating example: continuously compute current users per geo-region 4 7 5 3 2 8 4 7 6 3 2 7 Alice Real-time dashboard "How many users younger than 30y, per region?" alice Europe user-locations alice Asia, 25y, … bob Europe, 46y, … … … alice Europe, 25y, … bob Europe, 46y, … … … -1 +1 user-locations (mobile team) user-prefs (web team)
  • 95. 95Confidential Same data, but different use cases require different interpretations alice San Francisco alice New York City alice Rio de Janeiro alice Sydney alice Beijing alice Paris alice Berlin
  • 96. 96Confidential Same data, but different use cases require different interpretations alice San Francisco alice New York City alice Rio de Janeiro alice Sydney alice Beijing alice Paris alice Berlin Use case 1: Frequent traveler status? Use case 2: Current location?
  • 97. 97Confidential Same data, but different use cases require different interpretations "Alice has been to SFO, NYC, Rio, Sydney, Beijing, Paris, and finally Berlin." "Alice is in SFO, NYC, Rio, Sydney, Beijing, Paris, Berlin right now." ⚑ ⚑ ⚑⚑ ⚑ ⚑ ⚑ ⚑ ⚑ ⚑⚑ ⚑ ⚑ ⚑ Use case 1: Frequent traveler status? Use case 2: Current location?
  • 98. 98Confidential Same data, but different use cases require different interpretations alice San Francisco alice New York City alice Rio de Janeiro alice Sydney alice Beijing alice Paris alice Berlin Use case 1: Frequent traveler status? Use case 2: Current location? ⚑ ⚑ ⚑⚑ ⚑ ⚑⚑ ⚑
  • 101. 101Confidential Streams meet Tables
    When you need…             so that the topic is interpreted as a    then you'd read the Kafka topic into a    with messages interpreted as    Example
    All the values of a key    record stream                            KStream                                   INSERT (append)                 All the places Alice has ever been to
  • 102. 102Confidential Streams meet Tables
    When you need…             so that the topic is interpreted as a    then you'd read the Kafka topic into a    with messages interpreted as    Example
    All the values of a key    record stream                            KStream                                   INSERT (append)                 All the places Alice has ever been to
    Latest value of a key      changelog stream                         KTable                                    UPSERT (overwrite existing)     Where Alice is right now
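The INSERT-versus-UPSERT distinction in the table above can be demonstrated with plain Java collections (a self-contained model, not the Streams API): reading the same records as a stream yields Alice's full travel history, while reading them as a table yields only her latest location per key.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Self-contained model (not the Streams API) of reading the same topic as
// a KStream (INSERT/append) versus as a KTable (UPSERT/overwrite existing).
class StreamVsTable {
    // KStream interpretation: keep every value of a key (record stream)
    static List<String> asStream(List<String[]> records) {
        List<String> values = new ArrayList<>();
        for (String[] r : records) values.add(r[1]); // append
        return values;
    }

    // KTable interpretation: keep only the latest value per key (changelog)
    static Map<String, String> asTable(List<String[]> records) {
        Map<String, String> latest = new HashMap<>();
        for (String[] r : records) latest.put(r[0], r[1]); // overwrite
        return latest;
    }
}
```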
  • 103. 103Confidential Same data, but different use cases require different interpretations "Alice has been to SFO, NYC, Rio, Sydney, Beijing, Paris, and finally Berlin." "Alice is in SFO, NYC, Rio, Sydney, Beijing, Paris, Berlin right now." ⚑ ⚑ ⚑⚑ ⚑ ⚑ ⚑ ⚑ ⚑ ⚑⚑ ⚑ ⚑ ⚑ Use case 1: Frequent traveler status? Use case 2: Current location? KStream KTable
  • 104. 104Confidential Motivating example: continuously compute current users per geo-region 4 7 5 3 2 8 4 7 6 3 2 7 Alice Real-time dashboard "How many users younger than 30y, per region?" alice Europe user-locations alice Asia, 25y, … bob Europe, 46y, … … … alice Europe, 25y, … bob Europe, 46y, … … … -1 +1 user-locations (mobile team) user-prefs (web team)
  • 105. 105Confidential Motivating example: continuously compute current users per geo-region KTable<UserId, Location> userLocations = builder.table("user-locations-topic"); KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic");
  • 106. 106Confidential Motivating example: continuously compute current users per geo-region alice Europe user-locations alice Asia, 25y, … bob Europe, 46y, … … … alice Europe, 25y, … bob Europe, 46y, … … … KTable<UserId, Location> userLocations = builder.table("user-locations-topic"); KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic"); // Merge into detailed user profiles (continuously updated) KTable<UserId, UserProfile> userProfiles = userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs)); KTable userProfiles
  • 107. 107Confidential Motivating example: continuously compute current users per geo-region KTable<UserId, Location> userLocations = builder.table("user-locations-topic"); KTable<UserId, Prefs> userPrefs = builder.table("user-preferences-topic"); // Merge into detailed user profiles (continuously updated) KTable<UserId, UserProfile> userProfiles = userLocations.join(userPrefs, (loc, prefs) -> new UserProfile(loc, prefs)); // Compute per-region statistics (continuously updated; grouping by location makes the result key the region) KTable<Location, Long> usersPerRegion = userProfiles .filter((userId, profile) -> profile.age < 30) .groupBy((userId, profile) -> profile.location) .count(); alice Europe user-locations Africa 3 … … Asia 8 Europe 5 Africa 3 … … Asia 7 Europe 6 KTable usersPerRegion
  • 108. 108Confidential Motivating example: continuously compute current users per geo-region 4 7 5 3 2 8 4 7 6 3 2 7 Alice Real-time dashboard "How many users younger than 30y, per region?" alice Europe user-locations alice Asia, 25y, … bob Europe, 46y, … … … alice Europe, 25y, … bob Europe, 46y, … … … -1 +1 user-locations (mobile team) user-prefs (web team)
  • 109. 109Confidential Another common use case: continuous transformations • Example: to enrich an input stream (user clicks) with side data (current user profile) KStream alice /rental/p8454vb, 06:59 PM PDT user-clicks-topics (at 1M msgs/s) "facts"
  • 110. 110Confidential Another common use case: continuous transformations • Example: to enrich an input stream (user clicks) with side data (current user profile) KStream alice /rental/p8454vb, 06:59 PM PDT alice Asia, 25y bob Europe, 46y … … KTable user-profiles-topic user-clicks-topics (at 1M msgs/s) "facts" "dimensions"
  • 111. 111Confidential Another common use case: continuous transformations • Example: to enrich an input stream (user clicks) with side data (current user profile) KStream alice /rental/p8454vb, 06:59 PDT, Asia, 25y stream.JOIN(table) alice /rental/p8454vb, 06:59 PM PDT alice Asia, 25y bob Europe, 46y … … KTable user-profiles-topic user-clicks-topics (at 1M msgs/s) "facts" "dimensions"
  • 112. 112Confidential Another common use case: continuous transformations • Example: to enrich an input stream (user clicks) with side data (current user profile) KStream alice /rental/p8454vb, 06:59 PDT, Asia, 25y stream.JOIN(table) alice /rental/p8454vb, 06:59 PM PDT alice Asia, 25y bob Europe, 46y … … KTable alice Europe, 25y bob Europe, 46y … … alice Europe new update for alice from user-locations topic user-profiles-topic user-clicks-topics (at 1M msgs/s) "facts" "dimensions"
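The enrichment pattern on these slides, a high-volume fact stream joined against a continuously updated dimension table, can be modeled in plain Java (a self-contained sketch, not the Streams API): each click is enriched with the profile that is current at processing time, so a later table update changes how subsequent clicks are enriched.

```java
import java.util.HashMap;
import java.util.Map;

// Self-contained model (not the Streams API) of stream.join(table):
// clicks ("facts") are enriched with the CURRENT profile ("dimensions").
class EnrichmentJoin {
    final Map<String, String> profileTable = new HashMap<>(); // KTable-like

    void upsertProfile(String userId, String profile) {
        profileTable.put(userId, profile); // table update, e.g. from user-locations
    }

    String enrichClick(String userId, String click) { // one KStream record
        String profile = profileTable.get(userId);
        return click + ", " + (profile == null ? "unknown" : profile);
    }
}
```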
  • 116. 116Confidential Interactive Queries New API to access local state stores of an app instance charlie 3 bob 5 alice 2
  • 117. 117Confidential Interactive Queries New API to discover running app instances charlie 3 bob 5 alice 2 "host1:4460" "host5:5307" "host3:4777"
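How the two APIs fit together (local store access plus instance discovery) can be modeled with a consistent key-to-instance mapping (a self-contained sketch, not the actual Streams API; the hash-based placement below is an illustrative assumption, Streams actually places state by topic partition):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Self-contained model of interactive queries: state is sharded across app
// instances, and discovery metadata tells a client which instance
// ("host:port") holds the shard for a given key.
class InteractiveQueriesModel {
    final List<String> hosts; // running app instances
    final Map<String, Map<String, Long>> shards = new HashMap<>();

    InteractiveQueriesModel(List<String> hosts) {
        this.hosts = hosts;
        for (String h : hosts) shards.put(h, new HashMap<>());
    }

    String hostFor(String key) { // discovery: which instance owns this key?
        return hosts.get(Math.abs(key.hashCode()) % hosts.size());
    }

    void put(String key, long value) { // processing writes to the owning shard
        shards.get(hostFor(key)).put(key, value);
    }

    Long query(String key) { // client: discover the instance, then query it
        return shards.get(hostFor(key)).get(key);
    }
}
```

In a real deployment the "query the owning instance" step goes over a user-provided RPC layer, which is exactly what the next slide points out.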
  • 118. 118Confidential Interactive Queries You: inter-app communication (RPC layer)