This document discusses Airbnb's use of Kafka as the foundation for its highly reliable logging system. It describes the types of data Airbnb collects, including product events, database exports, service events, and derived data. Airbnb uses a simple logging pipeline in which events are delivered reliably through Kafka in real time. Key components of the production logging pipeline include Jitney for standardized messaging, a central schema repository, client SDKs, producer and consumer agents, and a self-service portal. The pipeline provides continuous integration across schema authoring, deployment, implementation, processing and storage, and monitoring. It handles large volumes of data reliably, with 150 brokers processing 1 million messages per second and 10 billion events collected daily with very low data loss.
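The flow described above, where a client SDK validates an event against a central schema repository before producing it to Kafka, can be sketched as follows. Jitney's internals and schema format are not detailed in the text, so the `SCHEMA_REPO` dict, the topic name, and the `StubProducer` class are all hypothetical stand-ins for illustration; real code would use an actual Kafka client SDK.

```python
import json
import time

# Hypothetical central schema repository: event name -> required fields.
# A real repository would hold versioned, structured schemas, not sets.
SCHEMA_REPO = {
    "booking_requested": {"event_name", "user_id", "listing_id", "timestamp"},
}

def validate(event: dict) -> bool:
    """Check an event against its registered schema before producing it."""
    required = SCHEMA_REPO.get(event.get("event_name"))
    return required is not None and required.issubset(event.keys())

class StubProducer:
    """Stand-in for a Kafka producer agent; collects messages in memory."""
    def __init__(self):
        self.sent = []

    def send(self, topic: str, event: dict) -> None:
        # Reject events that fail the schema check instead of
        # letting malformed data reach the pipeline.
        if not validate(event):
            raise ValueError(f"event fails schema check: {event}")
        self.sent.append((topic, json.dumps(event)))

producer = StubProducer()
producer.send("jitney.events", {          # hypothetical topic name
    "event_name": "booking_requested",
    "user_id": 42,
    "listing_id": 7,
    "timestamp": int(time.time()),
})
print(len(producer.sent))  # → 1
```

Validating at produce time is one way a pipeline like this keeps downstream consumers from ever seeing events that do not match a registered schema.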