This document provides an overview of sequence modeling with recurrent and recursive neural networks. It begins by relating recurrent networks to classical dynamical systems and showing how their computation can be unfolded into a computational graph. It then surveys recurrent architectures, including networks with recurrence through the hidden state and networks whose only recurrence is through the output. Bidirectional RNNs and encoder-decoder (sequence-to-sequence) architectures are also covered. Finally, the document discusses training difficulties such as exploding gradients and presents remedies such as LSTMs and gradient clipping, before introducing recursive networks and networks augmented with explicit memory components.
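Gradient clipping, one of the remedies mentioned above, limits the size of a parameter update when the gradient norm becomes very large. A minimal sketch of norm-based clipping in NumPy (the function name and threshold value here are illustrative, not from the source):

```python
import numpy as np

def clip_gradient(grad, threshold=1.0):
    """Rescale `grad` so its L2 norm is at most `threshold`.

    If the norm is already within the threshold, the gradient
    is returned unchanged; otherwise it is scaled down so its
    direction is preserved but its magnitude equals `threshold`.
    """
    norm = np.linalg.norm(grad)
    if norm > threshold:
        grad = grad * (threshold / norm)
    return grad

# An exploding gradient with norm 5.0 is rescaled to norm 1.0,
# while a small gradient passes through untouched.
big = np.array([3.0, 4.0])
small = np.array([0.1, 0.2])
clipped_big = clip_gradient(big, threshold=1.0)
clipped_small = clip_gradient(small, threshold=1.0)
print(np.linalg.norm(clipped_big))   # -> 1.0
print(np.allclose(clipped_small, small))  # -> True
```

Because only the magnitude is rescaled, the update still points in the direction of steepest descent, which is why this simple heuristic mitigates exploding gradients without altering the learning signal qualitatively.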