Streaming with Spring Cloud Stream and Apache Kafka - Soby Chacko

Streaming with Spring Cloud Stream and Apache Kafka with Soby Chacko at SpringOne Tour 2019


  1. Streaming with Spring and Apache Kafka: Spring Cloud Stream, Apache Kafka, Kafka Streams
  2. About me… Soby Chacko, Committer - Spring Cloud Stream/Spring Cloud Data Flow. Twitter: @sobychacko | GitHub: github.com/sobychacko
  3. Spring Cloud Stream Overview • General-purpose framework for writing event-driven/stream-processing microservices • Destination-based bindings on your choice of middleware
  4. Persistent publish-subscribe
  5. Consumer groups
  6. Partitioning support
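Consumer groups and partitioning are driven by binding properties rather than code. A minimal sketch, assuming the functional binding names process-in-0/process-out-0 and a payload with an id field (both illustrative, not from the slides):

      # application.properties
      # consumer side: instances sharing a group divide the destination's partitions between them
      spring.cloud.stream.bindings.process-in-0.group=wordcount-group
      # producer side: route outgoing messages to partitions by a key derived from the payload
      spring.cloud.stream.bindings.process-out-0.producer.partition-key-expression=payload.id
      spring.cloud.stream.bindings.process-out-0.producer.partition-count=3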
  7. Apache Kafka • Single source of truth for data • Fault tolerant • Based on the ubiquitous log data structure • Producer/Consumer/Streams APIs • Transactions • Exactly-once processing guarantees
  8. Spring Cloud Stream Application Model (diagram): an application core with inputs and outputs, wrapped by Spring Cloud Stream and layered on Spring Integration/Kafka Streams, Spring Kafka, and Spring Boot
  9. Zooming in on the Application Model (diagram): the Spring Cloud Stream application's core exchanges messages through inputs and outputs, which a binder connects to the middleware
  10. Spring Cloud Stream: an event-driven microservices framework with pluggable binder implementations. Opportunities: same code, same tests, drop-in replacement across a variety of message brokers - RabbitMQ, Apache Kafka, Google Pub/Sub, Amazon Kinesis, Azure Event Hubs, Solace
  11. Spring Cloud Stream application-level features • Message conversions are handled by the framework • Support for error handling - DLQ • Auto-provisioning of topics (destinations) • Schema evolution • Health indicators • And a whole lot of other convenient features…
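Error handling with a DLQ is also property-driven. A hedged sketch for the Kafka binder, again assuming the illustrative binding name process-in-0 and an arbitrary dead-letter topic name:

      # application.properties
      # retry the listener a few times, then publish the failed record to a dead-letter topic
      spring.cloud.stream.bindings.process-in-0.consumer.max-attempts=3
      spring.cloud.stream.kafka.bindings.process-in-0.consumer.enable-dlq=true
      spring.cloud.stream.kafka.bindings.process-in-0.consumer.dlq-name=wordcount-dlq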
  12. Spring Cloud Stream programming model • Java 8-based functional programming model • Spring Cloud Function support as the foundation • Reactive programming support using Project Reactor • Framework-provided annotation-based programming model (EnableBinding, StreamListener)
  13. Spring Cloud Stream application types • Sources - java.util.function.Supplier • Sinks - java.util.function.Consumer • Processors - java.util.function.Function
  14. A simple processor
      @SpringBootApplication
      …
      @Bean
      public Function<String, String> toUpperCase() {
        return String::toUpperCase;
      }
  15. A simple source
      @SpringBootApplication
      …
      @Bean
      public Supplier<Long> currentTime() {
        return System::currentTimeMillis;
      }
  16. A simple sink
      @SpringBootApplication
      …
      @Bean
      public Consumer<String> cons() {
        return x -> {
          System.out.println("Consumer");
        };
      }
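With the functional model, binding names are derived from the bean names above (toUpperCase-in-0, toUpperCase-out-0, and so on). A minimal sketch of the wiring, with illustrative destination names that are not from the slides:

      # application.properties
      spring.cloud.function.definition=toUpperCase;currentTime;cons
      spring.cloud.stream.bindings.toUpperCase-in-0.destination=uppercase-in
      spring.cloud.stream.bindings.toUpperCase-out-0.destination=uppercase-out
      spring.cloud.stream.bindings.currentTime-out-0.destination=time
      spring.cloud.stream.bindings.cons-in-0.destination=uppercase-out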
  17. Stream processing with Spring Cloud Stream and Kafka Streams
  18. “We call an application data-intensive if data is its primary challenge—the quantity of data, the complexity of data, or the speed at which it is changing.”
  19. Kafka Streams • Client library • No need for a dedicated processing cluster • All guarantees of Kafka are applicable • Per-record processing • Stateful stream processing • And many more features…
  20. Spring Cloud Stream with Kafka Streams (diagram): a Spring Boot application uses the Kafka Streams binder from Spring Cloud Stream, which builds on Spring Kafka and the Kafka Streams library and talks to the Kafka broker
  21. Major types available for stream processing • KStream • KTable • GlobalKTable Spring Cloud Stream provides binding capabilities for all three types.
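A hedged sketch of how a KStream and a KTable can be bound together, using a BiFunction with two inputs; the domain types Order, Customer, and EnrichedOrder are hypothetical, and Serdes are assumed to be configured through binder properties:

      @Bean
      public BiFunction<KStream<String, Order>, KTable<String, Customer>, KStream<String, EnrichedOrder>> enrich() {
        // join each incoming order with the customer record keyed by the same id
        return (orders, customers) ->
            orders.join(customers, (order, customer) -> new EnrichedOrder(order, customer));
      }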
  22. Stream processing concepts • Stream-table duality • Time-based windowing • Joins and aggregation • Stateless operations on data, such as map and filter
  23. Stateful stream processing • Kafka Streams provides built-in capabilities for stateful stream processing • Default state store - RocksDB • Spring Cloud Stream touch points for interactive queries in Kafka Streams
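One such touch point is the binder's InteractiveQueryService, which looks up a queryable state store by name. A minimal sketch, assuming a key-value store named "word-counts" (a windowed store, such as the one on slide 25, would be retrieved with QueryableStoreTypes.windowStore() instead):

      @Autowired
      private InteractiveQueryService interactiveQueryService;

      public Long countFor(String word) {
        // fetch the read-only view of the state store and query it by key
        ReadOnlyKeyValueStore<String, Long> store =
            interactiveQueryService.getQueryableStore("word-counts", QueryableStoreTypes.<String, Long>keyValueStore());
        return store.get(word);
      }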
  24. Spring Cloud Stream programming model for Kafka Streams apps • Java 8-based functional programming model • Processors can be written as java.util.function.Function or java.util.function.Consumer • Multiple input/output binding capabilities
  25. Kafka Streams app as a function…
      @Bean
      public Function<KStream<Object, String>, KStream<?, WordCount>> process() {
        return input -> input
            .flatMapValues(value -> Arrays.asList(value.toLowerCase().split("\\W+")))
            .map((key, value) -> new KeyValue<>(value, value))
            .groupByKey(Serialized.with(Serdes.String(), Serdes.String()))
            .windowedBy(TimeWindows.of(5000))
            .count(Materialized.as("foo-WordCounts"))
            .toStream()
            .map((key, value) -> new KeyValue<>(null,
                new WordCount(key.key(), value, new Date(key.window().start()), new Date(key.window().end()))));
      }
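The binder maps the function's inputs and outputs to Kafka topics through binding properties. A minimal sketch for the process function above, with illustrative topic names that are not from the slides:

      # application.properties
      spring.cloud.stream.bindings.process-in-0.destination=words
      spring.cloud.stream.bindings.process-out-0.destination=word-counts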
  26. Quick Demo…
  27. Resources • Code used for the demo • Samples Repository • Project page • Dataflow microsite • Confluent Blogs on Spring with Kafka:
 Part 1 | Part 2 | Part 3 | Part 4
  28. Questions? Comments?
