This document summarizes Letgo's data platform. It covers how Letgo ingests 500 GB of data per day across various event types and processes over 530 million events daily with near real-time latency. It describes Letgo's data journey toward an event-driven architecture with S3 as the data store, then walks through the platform's key components: stream processing for real-time user segmentation, batch processing for geo-data enrichment, querying data with Spark SQL, and data exploitation tools. It closes with tips on orchestrating jobs and monitoring metrics.
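As a rough illustration of the Spark SQL querying mentioned above, the sketch below reads event data from S3 and runs a simple daily aggregation. The bucket path, file format, and column names (event_type, event_ts) are assumptions for illustration only and do not reflect Letgo's actual data layout.

```scala
import org.apache.spark.sql.SparkSession

object EventQuerySketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-query-sketch")
      .getOrCreate()

    // Hypothetical S3 location; assumes Parquet-formatted events and s3a credentials configured.
    val events = spark.read.parquet("s3a://example-data-lake/events/")
    events.createOrReplaceTempView("events")

    // Count events per type and day with plain Spark SQL.
    val dailyCounts = spark.sql(
      """
        |SELECT event_type,
        |       to_date(event_ts) AS event_day,
        |       COUNT(*)          AS n_events
        |FROM events
        |GROUP BY event_type, to_date(event_ts)
        |ORDER BY event_day, event_type
      """.stripMargin)

    dailyCounts.show(20, truncate = false)
    spark.stop()
  }
}
```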