- QRNN (Quasi-Recurrent Neural Network) is a type of recurrent neural network that processes sequential data largely in parallel, like a CNN, making it up to 16 times faster than an LSTM at training and test time. Its gates and candidate vectors are computed from the input sequence alone via masked (causal) convolutions, rather than from the previous hidden state, so the heavy computation parallelizes across timesteps; only a lightweight elementwise pooling step remains sequential.
- The paper proposes the QRNN model, which alternates these masked convolutions with pooling layers to enable parallel processing of sequences. It evaluates QRNN on sentiment classification, language modeling, and character-level machine translation, finding it matches or exceeds LSTM accuracy while being much faster to train.
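
The mechanism in the first bullet can be sketched in NumPy: a causal convolution computes candidate vectors and gates for all timesteps at once, then "fo-pooling" (the gated recurrence from the QRNN paper) runs as a cheap elementwise loop. This is a minimal illustrative sketch, not the authors' implementation; the function name `qrnn_forward`, the weight shapes, and the single-layer setup are assumptions for the example.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def qrnn_forward(x, Wz, Wf, Wo, k=2):
    """Single QRNN layer with fo-pooling (illustrative sketch).

    x  : (T, d_in) input sequence
    W* : (k * d_in, d_out) filter weights for candidates / forget / output gates
    k  : convolution filter width
    """
    T, d_in = x.shape
    d_out = Wz.shape[1]
    # Masked (causal) convolution: pad on the left so timestep t only
    # sees inputs x[t-k+1 .. t]; all T windows are computed in parallel.
    xp = np.vstack([np.zeros((k - 1, d_in)), x])
    windows = np.stack([xp[t:t + k].reshape(-1) for t in range(T)])  # (T, k*d_in)
    z = np.tanh(windows @ Wz)     # candidate vectors, all timesteps at once
    f = sigmoid(windows @ Wf)     # forget gates
    o = sigmoid(windows @ Wo)     # output gates
    # fo-pooling: the only sequential step, and it is elementwise,
    # so it is far cheaper than an LSTM's full matrix recurrence.
    h = np.zeros((T, d_out))
    c = np.zeros(d_out)
    for t in range(T):
        c = f[t] * c + (1.0 - f[t]) * z[t]
        h[t] = o[t] * c
    return h
```

Because the convolution depends only on the inputs, the expensive matrix multiplies happen once for the whole sequence; the per-timestep loop touches only vectors, which is the source of the speedup over an LSTM.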