Cisco’s E-Commerce Transformation Using Kafka

(Gaurav Goyal + Dharmesh Panchmatia, Cisco Systems) Kafka Summit SF 2018

Cisco's e-commerce platform is a custom-built, mission-critical platform that accounts for $40+ billion of Cisco's revenue annually. It is a suite of 35 different applications and 300+ services that powers product configuration, pricing, quoting and order booking across all Cisco product lines, including hardware, software, services and subscriptions. It is a B2B platform used by the Cisco sales team, partners and direct customers, serving 140,000 unique users across the globe. To improve customer experience and business agility, Cisco decided to transition the platform to cloud-native technologies: MongoDB, Elasticsearch and Kafka.

In this session, we will share details around:
-Kafka architecture
-How we are experiencing significant resiliency advantages, zero-downtime deployment and improved performance
-How we’ve implemented Kafka to pass data to 20+ downstream applications, removing point-to-point integrations, batch jobs and standardizing the handshake
-How we are using Kafka to push data for machine learning and analytics use cases
-Best practices and lessons learned

  1. Cisco's eCommerce Transformation Using Kafka. Presented by: Dharmesh Panchmatia (Sr. Director, Cisco Systems) and Gaurav Goyal (Principal Architect, Cisco Systems).
  2. © 2017 Cisco and/or its affiliates. All rights reserved. Cisco Confidential. Agenda: 1. Kafka Use Cases; 2. Kafka Architecture; 3. Kafka Monitoring; 4. Lessons Learnt.
  3. Cisco Commerce by the Numbers: $50+B orders booked; 138 countries; 185K users; 16 languages; 63 device types; 16 browsers; 6M hits/day; 6.9M estimates; 5.3M quotes; 1.9M orders; 85.6% of orders auto-booked; 71% of orders via B2B, 29% via the portal.
  4. Commerce – Cloud Native (architecture diagram): order-capture applications on Tomcat in DC1 and DC2; a reference data source (RDBMS, DMPRD); a replicated transaction data store with one primary and four secondary nodes (N1–N5) across DC1–DC3; 73 cross-functional services (addresses, items, preferences, roles, contacts); logging; and downstream publish of transaction data.
  5. Kafka Use Cases
  6. Kafka – Use Cases:
     - Data push to downstream applications: avoids point-to-point integration and direct connections to the transactional DB.
     - Elasticsearch data push: reduces load on the transactional DB and eliminates Elasticsearch going out of sync across multiple data centers.
     - Machine learning use cases using Spark: recommendation engine; most popular configurations; most popular products for a given category.
  7. ML Use Case 1: Recommendation Engine. Customers who bought X also bought Y: identify products that are frequently bought together so that bundles or promotions can be created accordingly. Algorithm: Apriori.
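The slide names Apriori as the algorithm; a minimal sketch of its first step (counting product pairs that co-occur in orders above a support threshold) might look like the following, with toy order data standing in for the real quote/order streams:

```python
from itertools import combinations
from collections import Counter

def frequent_pairs(orders, min_support):
    """Apriori-style first pass: find product pairs bought together
    in at least `min_support` orders."""
    pair_counts = Counter()
    for order in orders:
        # Count each unordered pair of distinct products once per order.
        for pair in combinations(sorted(set(order)), 2):
            pair_counts[pair] += 1
    return {pair: n for pair, n in pair_counts.items() if n >= min_support}

# Toy data: each order is a list of product IDs (illustrative only).
orders = [
    ["router", "switch", "support"],
    ["router", "switch"],
    ["switch", "support"],
    ["router", "switch", "firewall"],
]
print(frequent_pairs(orders, min_support=3))  # {('router', 'switch'): 3}
```

Full Apriori would iterate this idea to larger itemsets, pruning candidates whose subsets are infrequent; in the deck's pipeline this would run in Spark over the Kafka-fed analytics store rather than in plain Python.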
  8. ML Use Case 2: Popular Product Configurations. Provide visibility into the most popular configurations for a given product, and into the configuration the customer most recently bought for that product. Allow selection of pre-configured products instead of starting from scratch.
  9. Kafka Architecture
  10. Kafka Architecture (diagram): producers (Capture Order, Return Order) write to a four-broker Kafka cluster spanning DC1 and DC2; a five-node ZooKeeper ensemble spread across DC1 (RCDN), DC2 (ALLEN) and DC3 (RTP) coordinates cluster membership; consumers (Smart-SW SC, EDW) commit offsets back to Kafka (v0.10.x.x).
  11. Kafka Architecture – Elasticsearch: custom producer code in DC1 and DC2 reads from the RDBMS and writes to a fault-tolerant Kafka deployment spanning both data centers; a consumer group in each DC indexes the data into the local Elasticsearch cluster (DC1 and DC2).
  12. Kafka Architecture – ML & Analytics Use Case (diagram): transaction data from the RDBMS (orders, estimates, quotes, subscriptions, invoices), reference data and clickstream data flow into a primary analytics data store, which serves data visualization, dynamic querying (MQL) and data science.
  13. Kafka Monitoring
  14. Monitoring: Kafka Manager and Kafdrop (screenshots of both tools).
  15. Kafka – Custom Scripts:
     1. Cron job to check Kafka processes every minute; restart the Kafka process (and send an email) if it is not running.
     2. Always back up logs systematically whenever Kafka processes are restarted.
     3. Maintain a test topic and push a test message to it every minute; trigger a notification on failure.
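A minimal sketch of the cron-driven health check described above. The process check uses pgrep against `kafka.Kafka`, the standard entry class of a Kafka broker; the restart and notification hooks are injectable placeholders (in Cisco's setup they would invoke the broker start script and email alerting, and the log backup would happen inside the restart hook):

```python
import subprocess

def kafka_running(pattern="kafka.Kafka"):
    """True if a process matching the broker's main class is running."""
    result = subprocess.run(["pgrep", "-f", pattern],
                            stdout=subprocess.DEVNULL)
    return result.returncode == 0

def watchdog_tick(check=kafka_running, restart=lambda: None,
                  notify=lambda msg: None):
    """One cron tick: if the broker is down, restart it and send an alert.
    Returns True when the broker was already healthy."""
    if check():
        return True
    restart()  # e.g. back up logs, then invoke kafka-server-start.sh
    notify("Kafka process was down; restart attempted")
    return False
```

This would run from a crontab entry every minute; the same pattern (a tiny producer sending to a dedicated test topic and alerting on send failure) covers item 3.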
  16. Best Practices / Lessons Learnt
  17. Best Practices:
     1. Have a mechanism to reset Kafka offsets on demand.
     2. Have a mechanism to re-push data to a Kafka topic.
     3. Enable SSL for secure access.
     4. Have an auto re-push mechanism for when the producer gets an error while pushing data into Kafka.
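The on-demand offset reset (item 1) can be sketched as follows. This assumes a kafka-python-style client exposing assign/seek/commit; partition handling is simplified to (topic, partition) tuples, where real code would build `TopicPartition` objects:

```python
def reset_offsets(consumer, topic, partitions, target_offset):
    """Rewind a consumer group to `target_offset` on every given
    partition of `topic`. No consumer in the group may be active
    while this runs (see lesson 8 later in the deck)."""
    assignments = [(topic, p) for p in partitions]
    consumer.assign(assignments)
    for tp in assignments:
        consumer.seek(tp, target_offset)
    consumer.commit()  # persist the new positions for the group
```

Wrapping this in an internal UI (as the next slides show) keeps the operation auditable and avoids ad hoc shell access to production clusters.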
  18. UI – Reset Offsets (screenshot).
  19. UI – Re-push Data (screenshot).
  20. Kafka Producer & Consumer Setup with SSL. The properties below are required to enable SSL for both producer and consumer. (1) If client authentication is not required by the broker, a truststore alone is sufficient (kafka.client.truststore.jks is provided by the Kafka service host). (2) If client authentication is required by the broker, a keystore is also needed (kafka.client.keystore.jks is provided by the Kafka service host).
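The standard Kafka client SSL properties the slide refers to look roughly like this (paths and passwords are placeholders):

```properties
# Case 1: broker does not require client authentication — truststore only
security.protocol=SSL
ssl.truststore.location=/var/private/ssl/kafka.client.truststore.jks
ssl.truststore.password=<truststore-password>

# Case 2: broker requires client authentication — additionally a keystore
ssl.keystore.location=/var/private/ssl/kafka.client.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
```

The same property names apply to both producer and consumer configurations.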
  21. Auto Re-push Mechanism (diagram): when a data push from the source to Kafka fails, the failed records are captured; an offline scheduler later re-pushes them.
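The capture-and-retry flow above can be sketched as follows; the `send` callable stands in for the real Kafka producer send, and `failed_store` for whatever durable store holds the failed records (both are assumptions, not the deck's actual implementation):

```python
def push_with_capture(send, records, failed_store):
    """Push records; any record whose send raises is captured for later."""
    for record in records:
        try:
            send(record)
        except Exception:
            failed_store.append(record)

def scheduled_repush(send, failed_store):
    """Offline scheduler pass: retry captured failures and keep only
    the records that still fail."""
    still_failed = []
    for record in failed_store:
        try:
            send(record)
        except Exception:
            still_failed.append(record)
    failed_store[:] = still_failed
```

In production the failed-record store should be durable (a DB table or file), so records survive process restarts between the failure and the scheduler's retry.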
  22. Lessons Learnt:
     1. Poll in a while loop on a subscribed consumer instead of creating a new consumer for every fetch.
     2. Always use a message key (e.g. order ID) when all messages for that key must go to the same partition.
     3. enable.auto.commit defaults to true; set it to false to get control over when the offset is committed.
     4. Data size: the consumer's max.partition.fetch.bytes should be greater than or equal to the producer's max.request.size (both default to 1 MB).
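Lessons 1 and 3 combined, as a sketch against a kafka-python-style consumer (where `poll` returns a dict of partition to messages); the consumer here is any object exposing poll/commit, and `keep_running` exists only so the loop is controllable:

```python
def consume_loop(consumer, handle, keep_running=lambda: True):
    """Lesson 1: one long-lived consumer polled in a loop.
    Lesson 3: with enable.auto.commit=False, commit only after the
    batch has been fully processed, so a crash mid-batch causes a
    replay rather than silent message loss."""
    while keep_running():
        batch = consumer.poll(timeout_ms=1000)
        for _partition, messages in batch.items():
            for message in messages:
                handle(message)
        if batch:
            consumer.commit()  # manual commit after successful processing
```

The corresponding consumer settings for lessons 3 and 4 would include `enable.auto.commit=false` and a `max.partition.fetch.bytes` at least as large as the producer's `max.request.size`.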
  23. Lessons Learnt (continued):
     5. Deploy a custom script to monitor and restart Kafka nodes in case of issues.
     6. heartbeat.interval.ms must be smaller than session.timeout.ms. session.timeout.ms controls how long it takes to detect that a consumer has crashed and stopped sending heartbeats; heartbeat.interval.ms is the expected time between heartbeats from the consumer.
     7. auto.offset.reset defaults to latest.
     8. When resetting offsets, make sure there is no active consumer on that topic for the consumer group.
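Illustrative consumer properties for lessons 6 and 7 (the values are examples, not the deck's; a common rule of thumb is to keep the heartbeat interval at no more than one third of the session timeout):

```properties
# Lesson 6: heartbeat interval well below the session timeout
session.timeout.ms=30000
heartbeat.interval.ms=3000

# Lesson 7: set explicitly rather than relying on the `latest` default,
# so new consumer groups behave predictably on first start
auto.offset.reset=earliest
```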
  24. Questions and Answers
  25. Kafka Architecture – ML Use Case (diagram): quote stream and order stream.
