This document summarizes an improved approach to vault-based tokenization that boosts vault lookup performance. Tokenization works by substituting sensitive data, such as payment card or health information, with surrogate tokens that carry no exploitable value. Traditionally, vault-based tokenization stores the mapping between sensitive data and tokens in centralized databases, or "vaults". However, as transaction volume grows, the vault grows with it, and lookup latency and scalability become challenges. The proposed solution addresses this by distributing vault data across partitions without requiring the partitions to synchronize their contents. It also employs a tokenization engine that spreads new entries uniformly across the partitioned databases, reducing lookup latency and improving scalability for large-scale vault-based tokenization systems.
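
To make the partitioning idea concrete, below is a minimal sketch of how a partitioned vault might avoid cross-partition synchronization: the token itself encodes which partition holds the mapping, so detokenization routes directly to a single shard, and new entries are assigned to partitions uniformly at random. The class name `PartitionedVault`, the partition count, and the `index-body` token format are illustrative assumptions, not details from the source.

```python
import secrets
import string

NUM_PARTITIONS = 8  # hypothetical partition count


class PartitionedVault:
    """Sketch of a partitioned token vault.

    Each token embeds its partition index, so lookups go to exactly
    one partition and partitions never need to sync contents.
    """

    def __init__(self, num_partitions: int = NUM_PARTITIONS):
        # Each partition is modeled as an in-memory dict; in practice
        # each would be a separate database shard.
        self.partitions = [{} for _ in range(num_partitions)]

    def tokenize(self, sensitive_value: str) -> str:
        # Pick a partition uniformly at random so entries spread
        # evenly across shards (the "uniform fill" behavior).
        idx = secrets.randbelow(len(self.partitions))
        # Random token body: no mathematical relation to the input.
        # A production system would also check for collisions.
        body = "".join(secrets.choice(string.digits) for _ in range(12))
        token = f"{idx}-{body}"
        self.partitions[idx][token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # The embedded partition index routes the lookup to exactly
        # one shard; no cross-partition search or synchronization.
        idx = int(token.split("-", 1)[0])
        return self.partitions[idx][token]


vault = PartitionedVault()
t = vault.tokenize("4111111111111111")
print(t, "->", vault.detokenize(t))
```

Because the routing information lives in the token rather than in shared state, adding partitions only changes where new tokens land; existing tokens remain resolvable without any re-synchronization.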