This paper evaluates two clipping-and-filtering methods for reducing the peak-to-average power ratio (PAPR) of OFDM signals, which is essential for enhancing the performance of LTE wireless communications. The study highlights the trade-offs between PAPR reduction and metrics such as bit error rate (BER) and computational complexity. Simulation results demonstrate that the proposed method achieves significant PAPR reduction while limiting BER degradation relative to existing approaches.
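As a minimal illustration of the technique the abstract refers to, the sketch below computes the PAPR of an OFDM symbol and applies one clipping-and-filtering iteration: the time-domain signal is amplitude-clipped at a threshold relative to its RMS, then filtered in the frequency domain by zeroing the out-of-band bins that clipping spread energy into. The signal parameters (64 QPSK subcarriers, 4x oversampling, clipping at 1.5x RMS) and all function names are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip_and_filter(x, clip_ratio=1.5, n_active=64):
    """One clipping-and-filtering iteration (illustrative sketch).

    Clip the envelope at clip_ratio times the RMS amplitude (phase is
    preserved), then zero the out-of-band FFT bins so only the originally
    active subcarriers remain.
    """
    rms = np.sqrt(np.mean(np.abs(x) ** 2))
    a_max = clip_ratio * rms
    mag = np.abs(x)
    # Envelope clipping: keep the phase, cap the magnitude at a_max.
    clipped = np.where(mag > a_max, a_max * np.exp(1j * np.angle(x)), x)
    # Frequency-domain filtering: keep only the active subcarrier bins.
    X = np.fft.fft(clipped)
    mask = np.zeros_like(X)
    mask[:n_active // 2] = 1
    mask[-(n_active // 2):] = 1
    return np.fft.ifft(X * mask)

# Build a random OFDM symbol: 64 QPSK subcarriers, 4x oversampled (N = 256).
rng = np.random.default_rng(0)
n_active, n_fft = 64, 256
sym = (rng.choice([-1, 1], n_active)
       + 1j * rng.choice([-1, 1], n_active)) / np.sqrt(2)
X = np.zeros(n_fft, dtype=complex)
X[:n_active // 2] = sym[:n_active // 2]
X[-(n_active // 2):] = sym[n_active // 2:]
x = np.fft.ifft(X)

x_cf = clip_and_filter(x, clip_ratio=1.5, n_active=n_active)
print(f"PAPR before: {papr_db(x):.2f} dB, after: {papr_db(x_cf):.2f} dB")
```

Filtering removes the out-of-band emissions that clipping introduces, at the cost of some peak regrowth, which is why the method is typically iterated; the residual in-band distortion is what drives the BER trade-off the abstract describes.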