One of the challenges we face at DoorDash every day is keeping our API latency low. While the problem sounds simple on the surface, it can get interesting at times. One of our endpoints, which serves restaurant menus to our consumers, had high p99 latency. Since it is a high-traffic endpoint, we naturally lean on caching heavily: we cache serialized menus in Redis to avoid repeated calls to the database and to spread out the read traffic load. In this post we will show how we used compression not only to improve our latency, but also to free up more space in our cache.
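To make the idea concrete, here is a minimal sketch of compressing a serialized menu before writing it to a cache. The menu payload, key names, and the use of `zlib` over JSON are illustrative assumptions, not DoorDash's actual implementation, and a plain dict stands in for the Redis client:

```python
import json
import zlib

# Hypothetical menu payload; the real system caches serialized restaurant menus.
menu = {"restaurant_id": 42, "items": [{"name": "Burrito", "price_cents": 899}] * 50}

def compress_payload(obj) -> bytes:
    """Serialize to JSON and compress with zlib before writing to the cache."""
    return zlib.compress(json.dumps(obj).encode("utf-8"))

def decompress_payload(blob: bytes):
    """Decompress and deserialize a cached value after reading it back."""
    return json.loads(zlib.decompress(blob).decode("utf-8"))

# A plain dict stands in for Redis; a real client would use SET/GET on the same bytes.
cache = {}
cache["menu:42"] = compress_payload(menu)

restored = decompress_payload(cache["menu:42"])

raw_size = len(json.dumps(menu).encode("utf-8"))
compressed_size = len(cache["menu:42"])
print(f"raw={raw_size} bytes, compressed={compressed_size} bytes")
```

Because the compressed value is smaller, each cache entry costs less memory and less network transfer time on reads, which is where both the latency and capacity wins come from.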