
Message queue demo

A messaging demo that illustrates basic use cases of Apache Kafka and RabbitMQ.


  1. Message Queues Demo
  2. Apache Kafka
  3. Start
     Start ZooKeeper:
       bin/zookeeper-server-start.sh config/zookeeper.properties
     Start the brokers:
       bin/kafka-server-start.sh config/server.properties
       bin/kafka-server-start.sh config/server1.properties
       bin/kafka-server-start.sh config/server2.properties
  4. Create a topic
       bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 3 --partitions 1 --topic demo
       bin/kafka-topics.sh --list --zookeeper localhost:2181
  5. Create a producer
       bin/kafka-console-producer.sh --broker-list localhost:9092 --topic demo
  6. Create a consumer
       bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --from-beginning --topic demo
     (A Java producer/consumer sketch for the same topic follows the slide list.)
  7. Kill a broker
       bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic demo
     Then kill a leader broker…
       bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic demo
     Check available messages…
       bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --from-beginning --topic demo
  8. Kafka Streams for data processing
     Let’s create a file…
       echo -e "all streams lead to kafka\nhello kafka streams\njoin kafka summit" > file-input.txt
     …and then create a topic…
       bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic streams-file-input
     …and publish data to this topic…
       bin/kafka-console-producer.sh --broker-list localhost:9092 --topic streams-file-input < file-input.txt
  9. Kafka Streams for data processing
     Let’s run the analytics…
       bin/kafka-run-class.sh org.apache.kafka.streams.examples.wordcount.WordCountDemo
     …and see the results in the output topic:
       bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic streams-wordcount-output --from-beginning --property print.key=true --property value.deserializer=org.apache.kafka.common.serialization.LongDeserializer
  10. Kafka Streams for data processing
     The core of WordCountDemo (a fuller sketch of the program follows the slide list):
       KTable<String, Long> wordCounts = textLines
           // Split each text line, by whitespace, into words.
           .flatMapValues(value -> Arrays.asList(value.toLowerCase().split("\\W+")))
           // Ensure the words are available as record keys for the next aggregate operation.
           .map((key, value) -> new KeyValue<>(value, value))
           // Count the occurrences of each word (record key) and store the results into a table named "Counts".
           .countByKey("Counts");
  11. RabbitMQ
  12. Start
       sbin/rabbitmq-server -detached
       nano etc/rabbitmq/rabbitmq.config
       sbin/rabbitmqctl status
     Web UI:
       sbin/rabbitmq-plugins enable rabbitmq_management
       http://localhost:15672/
     Management HTTP API:
       http://localhost:15672/api/
  13. Sending data (a Java send/receive sketch follows the slide list)
  14. Receiving data
  15. What happened
  16. Routing
  17. Exchange types
     Exchanges are the entities to which messages are sent. An exchange takes a message and routes it into zero or more queues. The routing algorithm used depends on the exchange type and on rules called bindings.
     Types: Direct, Fanout, Topic, Headers
     (A direct-exchange routing sketch follows the slide list.)
  18. Direct exchange
  19. Fanout exchange
  20. Topic exchange
  21. Clients
  22. Q&A
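
The console producer and consumer on slides 5 and 6 have straightforward Java equivalents. The sketch below, assuming the standard Apache Kafka Java client (kafka-clients) is on the classpath, publishes one message to the demo topic and then reads the topic from the beginning; the class name, the group id demo-java-group, and the bounded poll loop are illustrative choices rather than part of the original demo.

  import java.util.Collections;
  import java.util.Properties;
  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.Producer;
  import org.apache.kafka.clients.producer.ProducerRecord;

  public class KafkaDemoClient {
      public static void main(String[] args) {
          // Produce a single message to the "demo" topic.
          Properties p = new Properties();
          p.put("bootstrap.servers", "localhost:9092");
          p.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          p.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
          try (Producer<String, String> producer = new KafkaProducer<>(p)) {
              producer.send(new ProducerRecord<>("demo", "hello from the Java client"));
          }

          // Consume the topic from the beginning, like --from-beginning on slide 6.
          Properties c = new Properties();
          c.put("bootstrap.servers", "localhost:9092");
          c.put("group.id", "demo-java-group");
          c.put("auto.offset.reset", "earliest");
          c.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          c.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
          try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(c)) {
              consumer.subscribe(Collections.singletonList("demo"));
              for (int i = 0; i < 10; i++) {  // poll a few times, then exit
                  ConsumerRecords<String, String> records = consumer.poll(500);
                  for (ConsumerRecord<String, String> record : records) {
                      System.out.printf("partition=%d offset=%d value=%s%n",
                              record.partition(), record.offset(), record.value());
                  }
              }
          }
      }
  }

Run it while the brokers from slide 3 are up; the consumer output should include both the console-produced messages and the one sent here.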
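
Slide 10 shows only the topology of WordCountDemo. For context, here is a rough sketch of the surrounding program, assuming the 0.10-era Kafka Streams API that the slide's countByKey call comes from (KStreamBuilder plus the KEY_SERDE/VALUE_SERDE config keys); the class name and application id are made up for illustration, and newer Kafka Streams releases replace countByKey with groupByKey().count().

  import java.util.Arrays;
  import java.util.Properties;
  import org.apache.kafka.common.serialization.Serdes;
  import org.apache.kafka.streams.KafkaStreams;
  import org.apache.kafka.streams.KeyValue;
  import org.apache.kafka.streams.StreamsConfig;
  import org.apache.kafka.streams.kstream.KStream;
  import org.apache.kafka.streams.kstream.KStreamBuilder;
  import org.apache.kafka.streams.kstream.KTable;

  public class WordCountSketch {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-demo");
          props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          props.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
          props.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());

          KStreamBuilder builder = new KStreamBuilder();
          KStream<String, String> textLines = builder.stream("streams-file-input");

          KTable<String, Long> wordCounts = textLines
                  // Split each text line, by whitespace, into words.
                  .flatMapValues(value -> Arrays.asList(value.toLowerCase().split("\\W+")))
                  // Make each word the record key for the aggregation step.
                  .map((key, value) -> new KeyValue<>(value, value))
                  // Count occurrences per key, stored in a state store named "Counts".
                  .countByKey("Counts");

          // Write the running counts to the output topic read on slide 9.
          wordCounts.to(Serdes.String(), Serdes.Long(), "streams-wordcount-output");

          new KafkaStreams(builder, props).start();
      }
  }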
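
Slides 13 and 14 demonstrate sending and receiving a message. Below is a minimal "hello world" sketch with the RabbitMQ Java client (amqp-client 4.x or newer, where Connection and Channel are AutoCloseable); the queue name hello and the one-second wait at the end are illustrative assumptions.

  import java.nio.charset.StandardCharsets;
  import com.rabbitmq.client.AMQP;
  import com.rabbitmq.client.Channel;
  import com.rabbitmq.client.Connection;
  import com.rabbitmq.client.ConnectionFactory;
  import com.rabbitmq.client.DefaultConsumer;
  import com.rabbitmq.client.Envelope;

  public class HelloRabbit {
      public static void main(String[] args) throws Exception {
          ConnectionFactory factory = new ConnectionFactory();
          factory.setHost("localhost");
          try (Connection connection = factory.newConnection();
               Channel channel = connection.createChannel()) {

              // Declare the queue (idempotent) and publish through the default exchange.
              channel.queueDeclare("hello", false, false, false, null);
              channel.basicPublish("", "hello", null, "Hello World!".getBytes(StandardCharsets.UTF_8));

              // Subscribe with auto-ack; the callback runs for every delivered message.
              channel.basicConsume("hello", true, new DefaultConsumer(channel) {
                  @Override
                  public void handleDelivery(String consumerTag, Envelope envelope,
                                             AMQP.BasicProperties properties, byte[] body) {
                      System.out.println("Received: " + new String(body, StandardCharsets.UTF_8));
                  }
              });

              Thread.sleep(1000); // let the delivery arrive before the connection closes
          }
      }
  }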
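
To make the exchange-types slide concrete, the sketch below routes through a direct exchange: a server-named queue is bound with the binding key "error", so only messages published with that routing key are delivered to it, while messages with other keys are dropped because no queue is bound for them. The exchange name logs_direct and the message bodies are invented for illustration.

  import java.nio.charset.StandardCharsets;
  import com.rabbitmq.client.Channel;
  import com.rabbitmq.client.Connection;
  import com.rabbitmq.client.ConnectionFactory;

  public class DirectExchangeDemo {
      public static void main(String[] args) throws Exception {
          ConnectionFactory factory = new ConnectionFactory();
          factory.setHost("localhost");
          try (Connection connection = factory.newConnection();
               Channel channel = connection.createChannel()) {

              // A direct exchange routes a message to the queues whose binding key
              // exactly matches the message's routing key.
              channel.exchangeDeclare("logs_direct", "direct");

              // Server-named queue bound only to the "error" routing key.
              String errorQueue = channel.queueDeclare().getQueue();
              channel.queueBind(errorQueue, "logs_direct", "error");

              // Routed to errorQueue (binding key matches).
              channel.basicPublish("logs_direct", "error", null,
                      "disk failure".getBytes(StandardCharsets.UTF_8));
              // Dropped: no queue is bound with the "info" key.
              channel.basicPublish("logs_direct", "info", null,
                      "all good".getBytes(StandardCharsets.UTF_8));
          }
      }
  }

Fanout and topic exchanges differ only in the binding rules: a fanout exchange ignores the routing key and copies the message to every bound queue, while a topic exchange matches the routing key against binding patterns such as "logs.*".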
