This paper investigates tolerable timing-mismatch levels in time-interleaved analog-to-digital converters (TI-ADCs) used in high-speed orthogonal frequency-division multiplexing (OFDM) communication systems. It provides a theoretical framework for determining the maximum mismatch that limits performance degradation, in particular keeping the bit error rate (BER) below a specified threshold. Simulations support the analysis, showing that holding mismatch levels under 25% of the defined threshold yields acceptable performance degradation.
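As background for the mismatch mechanism the abstract refers to, a minimal Python sketch (all parameters illustrative, not taken from the paper) of how a fixed timing skew on one sub-ADC of a two-channel TI-ADC creates an image spur at fs/2 - fin when sampling a single tone:

```python
import numpy as np

fs = 1.0                        # normalized sampling rate (assumption)
N = 4096                        # FFT length (assumption)
k = 505                         # coherent input bin (assumption)
fin = k / N * fs                # input tone frequency
skew = 0.02 / fs                # illustrative timing skew: 2% of a sample period

n = np.arange(N)
t = n / fs
# Odd-indexed samples come from the skewed sub-ADC: its sampling
# instants are offset by a fixed amount -> TI-ADC timing mismatch.
t_skewed = t + skew * (n % 2)
x = np.sin(2 * np.pi * fin * t_skewed)

# Windowed spectrum: the mismatch produces an image spur at fs/2 - fin,
# i.e. at bin N/2 - k for this two-channel interleaver.
X = np.abs(np.fft.rfft(x * np.hanning(N)))
sig_bin, spur_bin = k, N // 2 - k
spur_dbc = 20 * np.log10(X[spur_bin] / X[sig_bin])
print(f"image spur level: {spur_dbc:.1f} dBc")
```

The spur level scales with both the input frequency and the skew (roughly pi * fin * skew relative to the carrier for small skew), which is why a tolerable-mismatch bound of the kind the paper derives depends on the signal bandwidth.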