The document discusses SkipRNN and RepeatRNN, two recurrent neural network models that dynamically control how much computation they perform. SkipRNN learns to skip state updates by introducing a binary gate that, at each time step, decides whether to update the hidden state or copy it forward unchanged; steps where the state is copied require no cell computation. On the sequential tasks reported, it achieved better accuracy with less computation than standard RNN baselines. RepeatRNN reproduces Adaptive Computation Time (ACT), which lets the network repeat its computation on difficult examples, and was found to train and run inference faster than the baselines on an addition task. Code for both models is available online in PyTorch and TensorFlow; illustrative sketches of both mechanisms are given below.
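
As a minimal PyTorch sketch of the skip-update rule described above: a binary gate u_t selects between a fresh cell output and a copy of the previous state, the gate is binarized with a straight-through estimator so it stays differentiable, and a cumulative update probability accumulates across skipped steps so an update eventually fires. The class names `SkipGRUCell` and `BinarizeSTE`, the choice of a GRU cell, and the exact bookkeeping are assumptions for illustration, not the authors' released code (which the document links).

```python
import torch
import torch.nn as nn

class BinarizeSTE(torch.autograd.Function):
    """Round a probability to {0, 1} in the forward pass;
    pass gradients straight through in the backward pass."""
    @staticmethod
    def forward(ctx, p):
        return (p > 0.5).float()

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output

class SkipGRUCell(nn.Module):
    """GRU cell with a learned binary gate that either updates or copies the state."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.gate = nn.Linear(hidden_size, 1)  # maps state to an update-probability increment

    def forward(self, x, state, cum_p):
        # Binary decision: 1 = update the state, 0 = copy it (skip the update).
        u = BinarizeSTE.apply(cum_p)
        new_state = self.cell(x, state)
        state = u * new_state + (1.0 - u) * state
        # If we updated, restart the update probability from the new state;
        # if we skipped, accumulate it (capped at 1) so an update eventually fires.
        delta_p = torch.sigmoid(self.gate(state))
        cum_p = u * delta_p + (1.0 - u) * torch.clamp(cum_p + delta_p, max=1.0)
        return state, cum_p

# Usage over a toy sequence (hypothetical shapes):
cell = SkipGRUCell(input_size=8, hidden_size=32)
x = torch.randn(4, 10, 8)          # (batch, time, features)
state = torch.zeros(4, 32)
cum_p = torch.ones(4, 1)           # forces an update at the first step
for t in range(x.size(1)):
    state, cum_p = cell(x[:, t], state, cum_p)
```

Note that this sketch always evaluates the cell and masks its result; a real implementation saves computation by skipping the cell call entirely for examples whose gate is 0.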
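
The repeat mechanism can be sketched in the same style. The summary describes RepeatRNN as repeating computation on its inputs; one simple formulation applies the recurrent cell several times to the same input element before moving on, giving the network extra processing per step. The wrapper name `RepeatRNN` and the `repeats` parameter below are illustrative assumptions, not the released interface.

```python
import torch
import torch.nn as nn

class RepeatRNN(nn.Module):
    """Wrapper that applies the wrapped cell `repeats` times per input step,
    granting the network additional computation on every sequence element."""
    def __init__(self, input_size, hidden_size, repeats=3):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.repeats = repeats
        self.hidden_size = hidden_size

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        state = x.new_zeros(batch, self.hidden_size)
        outputs = []
        for t in range(seq_len):
            for _ in range(self.repeats):   # repeat the state update on the same input
                state = self.cell(x[:, t], state)
            outputs.append(state)
        return torch.stack(outputs, dim=1), state

# Usage on a toy batch:
model = RepeatRNN(input_size=8, hidden_size=32, repeats=3)
outputs, final_state = model(torch.randn(4, 10, 8))
```

Because the repeat count is fixed ahead of time, this variant avoids ACT's per-step halting machinery, which is consistent with the summary's observation that it trains and runs inference faster than the baselines.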