6. Daggers
● Started with one use case - dynamic surge pricing
● Hand-coded Flink and Kafka Streams jobs in 4 weeks
● 20 other use cases in the pipeline
● We had to do something better
9. Daggers
● A generic Flink job
● Consumes data from Kafka
● Deserializes protobuf messages with the help of our proto schema registry
● Can process up to 2 streams
● Aggregated data from the stream(s) is sent to a sink
● Support for custom UDFs and post processors (a job sketch follows this list)
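As a rough illustration, a Dagger can be pictured as the kind of generic Flink job sketched below: consume a Kafka topic, aggregate over a window, and hand the result to a sink. The broker address, topic name, field extraction, and windowing here are assumptions for the sketch; a real Dagger deserializes protobuf through the proto schema registry rather than reading plain strings.

```java
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

import java.util.Properties;

public class DaggerJobSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "kafka:9092"); // assumed broker address
        props.setProperty("group.id", "dagger-sketch");

        // Source: booking events from Kafka. A real Dagger deserializes protobuf with
        // the proto schema registry; plain strings keep this sketch self-contained.
        DataStream<String> bookings = env.addSource(
                new FlinkKafkaConsumer<>("booking-log", new SimpleStringSchema(), props));

        // Aggregation: bookings per service area over a 1-minute tumbling window,
        // standing in for the demand side of dynamic surge pricing.
        DataStream<Tuple2<String, Long>> demandPerArea = bookings
                .map(event -> Tuple2.of(extractServiceArea(event), 1L))
                .returns(Types.TUPLE(Types.STRING, Types.LONG))
                .keyBy(new KeySelector<Tuple2<String, Long>, String>() {
                    @Override
                    public String getKey(Tuple2<String, Long> t) {
                        return t.f0;
                    }
                })
                .window(TumblingProcessingTimeWindows.of(Time.minutes(1)))
                .sum(1);

        // Sink: a real Dagger sends this to Influx or Kafka; print() keeps the sketch runnable.
        demandPerArea.print();

        env.execute("dagger-sketch");
    }

    // Hypothetical field extraction; the real job reads typed fields from the proto schema.
    private static String extractServiceArea(String event) {
        return event.split(",")[0];
    }
}
```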
12. Sinks
Influx Sink
● Default data sink
● Integrated with Grafana - used for monitoring & alerting on business metrics (sketch below)
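Conceptually, the Influx sink turns each aggregated row into a time-series point that Grafana can chart. The sketch below uses the influxdb-java client; the connection details, database, measurement, tag, and field names are placeholders, not the actual Dagger configuration.

```java
import org.influxdb.InfluxDB;
import org.influxdb.InfluxDBFactory;
import org.influxdb.dto.Point;

import java.util.concurrent.TimeUnit;

public class InfluxSinkSketch {
    public static void main(String[] args) {
        // Placeholder connection details; the real sink is configured per Dagger.
        InfluxDB influxDB = InfluxDBFactory.connect("http://influxdb:8086", "dagger", "secret");

        // One aggregated row becomes one point: tags for dimensions, fields for values.
        Point point = Point.measurement("booking_demand")
                .time(System.currentTimeMillis(), TimeUnit.MILLISECONDS)
                .tag("service_area", "jakarta_pusat")
                .addField("demand", 42L)
                .build();

        // Database and retention policy names are assumptions for this sketch.
        influxDB.write("daggers", "autogen", point);
        influxDB.close();
    }
}
```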
13. Sinks
Kafka Sink
● Publishes to a Kafka topic with a proto mapping
● Another DIY tool sinks from Kafka to one of the following (a producer sketch follows this list):
○ Services - HTTP or gRPC
○ DB - relational or time series
○ Redis and Elasticsearch
○ Analytics platforms - CleverTap or Mixpanel
○ Log - for debugging
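The producing side of the Kafka sink can be pictured as a plain producer publishing the processed result back onto a topic, from which the DIY tool fans it out to the targets above. In the sketch below a placeholder byte array stands in for the protobuf message that the real sink builds from the configured proto mapping; the topic and broker are assumptions.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;

import java.nio.charset.StandardCharsets;
import java.util.Properties;

public class KafkaSinkSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092"); // assumed broker address
        props.put("key.serializer", ByteArraySerializer.class.getName());
        props.put("value.serializer", ByteArraySerializer.class.getName());

        try (KafkaProducer<byte[], byte[]> producer = new KafkaProducer<>(props)) {
            // The real sink serializes a protobuf message from the proto mapping;
            // a readable placeholder payload keeps this sketch self-contained.
            byte[] payload = "surge_factor=1.4,area=jakarta_pusat".getBytes(StandardCharsets.UTF_8);
            producer.send(new ProducerRecord<>("surge-pricing-result", payload));
            producer.flush();
        }
    }
}
```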
17. Alerting
● Automated alerts.
● Users are provided with a health dashboard.
● Alerts are sent to specific teams via their Slack channels and PagerDuty (see the sketch below).
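As one illustration of the routing, an automated alert can be pushed to a team's Slack channel through an incoming webhook; the webhook URL and alert text below are placeholders, and PagerDuty routing would go through its own events API in the same spirit.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SlackAlertSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder incoming-webhook URL; each team routes alerts to its own channel.
        String webhookUrl = "https://hooks.slack.com/services/T000/B000/XXXX";
        String alert = "{\"text\": \"Dagger surge-pricing: Kafka consumer lag above threshold\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(webhookUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(alert))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Slack responded with HTTP " + response.statusCode());
    }
}
```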
18. Use Cases
○ Fraud Management
○ Supply Availability
○ System Monitoring Metrics
○ Driver Allocation
○ And many more...
20. Real-Time Daggers in Production
● Spanned over 26 Flink clusters
● Most Daggers are created by analysts
● Actively used for monitoring
● Dashboards created are used by city heads
● Backbone of many services like fraud and surge
● A single form to create a Dagger
● The data can be sent to a sink
● Data is ready to be consumed as soon as it is generated
● Real-time data analysis across all clusters
● Processed data is sent to one of the sinks
Data processed every day
300+ · 2 min · 1+ TB
21. Impact
01 - 6+ billion messages/day
● For system uptime
● Across 500 microservices
02 - 44,000 geolocation
● For dynamic surge pricing
● Demand & supply
04 - User segmentation & real-time triggers
● For growth campaigns
● 26% better conversion
22. Thank You!
Let’s talk. (We are hiring!)
Arujit Pradhan
Twitter: @james_bondu
Rasyid Hakim
LinkedIn: https://bit.ly/2NAxWC2