The document discusses how gated recurrent neural networks such as LSTMs and GRUs can achieve quasi-invariance to time transformations of their input data through their gating mechanisms. It proposes a new initialization scheme, "chrono initialization," which sets the forget- and input-gate biases according to the expected range of temporal dependencies in the data (forget biases drawn so that characteristic forgetting times span that range, with input biases set to their negatives). Experiments on synthetic tasks show that LSTMs with chrono initialization learn long-term dependencies better than standardly initialized LSTMs and are more robust to random time warpings of the inputs.
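
As a concrete illustration, here is a minimal sketch of chrono initialization for an LSTM, assuming PyTorch and a maximum dependency range `t_max`; the helper name `chrono_init` is hypothetical, not from the source. PyTorch stores two bias vectors per layer (`bias_ih`, `bias_hh`), each laid out as [input | forget | cell | output] gates, and the effective gate bias is their sum, so the sketch zeroes `bias_hh` and writes the chrono values into `bias_ih`.

```python
import torch
import torch.nn as nn

def chrono_init(lstm: nn.LSTM, t_max: float) -> None:
    """Hypothetical helper applying chrono initialization to a PyTorch LSTM.

    Forget-gate biases are drawn as log(U(1, t_max - 1)) so that the
    characteristic forgetting times spread over roughly [1, t_max];
    input-gate biases are set to the negative of the forget biases.
    """
    h = lstm.hidden_size
    with torch.no_grad():
        for name, param in lstm.named_parameters():
            if name.startswith("bias_hh"):
                # Zero the second bias vector so bias_ih alone carries
                # the effective gate bias.
                param.zero_()
            elif name.startswith("bias_ih"):
                param.zero_()
                # b_f ~ log(U(1, t_max - 1))
                b_f = torch.log(torch.empty(h).uniform_(1.0, t_max - 1.0))
                param[h:2 * h] = b_f   # forget gate
                param[0:h] = -b_f      # input gate: b_i = -b_f

# Usage: an LSTM expected to capture dependencies up to ~100 steps.
lstm = nn.LSTM(input_size=32, hidden_size=64)
chrono_init(lstm, t_max=100.0)
```

Initializing the forget bias this way starts the gates in a "remember for a long time" regime instead of leaving memory timescales to chance, which is what lets the network pick up long-term dependencies from the outset.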