Confluent and Couchbase – Event Streaming Platform + NoSQL combined. This slide deck introduces Apache Kafka as an event streaming platform and shows how to leverage Kafka Connect to integrate with Couchbase.
Sample Best Fit Use Cases
Services requiring a low-latency, highly available, and scalable data ingestion or presentation tier with onward transport of data.
Serving data with high availability to a very large number of readers (up to millions) with deterministic low latency.
Services that wish to transform streaming data and quickly store intermediate state for further processing.
Services storing or processing a high cardinality of entities or with rapid schema evolution.
Services with operational data storage requirements up to tens of terabytes.
Examples of typical applications requiring these functionalities:
Recommendation engines, predictive analytics engines, fraud detection frameworks, risk analytics engines, trader toolkits, real-time trade blotters.
Kafka Connect Couchbase Connector
Stream, filter, and transform events to and from Couchbase with Source and Sink connectors.
Fast, reliable, and fault-tolerant: based on DCP, Couchbase's internal replication protocol (Database Change Protocol).
Efficient: loads only new or modified documents.
Real-time: every mutation in Couchbase generates an event, which is published to a Kafka topic.
End-to-End monitoring: Integrated with Confluent Control Center:
Kafka is the de facto standard for data movement
Unified control, monitoring, and metrics
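As an illustration, a Couchbase source connector could be registered with a configuration along these lines. This is a sketch only: exact property names vary across connector versions, and the bucket name, credentials, and topic shown here are placeholder assumptions, not values from this deck.

```json
{
  "name": "couchbase-source",
  "config": {
    "connector.class": "com.couchbase.connect.kafka.CouchbaseSourceConnector",
    "tasks.max": "2",
    "couchbase.seed.nodes": "couchbase-host",
    "couchbase.bucket": "travel-sample",
    "couchbase.username": "admin",
    "couchbase.password": "secret",
    "couchbase.topic": "couchbase-events"
  }
}
```

Posting a configuration like this to the Kafka Connect REST API (POST /connectors) starts the connector; from then on, DCP mutation events from the bucket flow into the target Kafka topic, where they can be filtered, transformed, and consumed downstream.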