This document summarizes a tutorial on recurrent neural networks (RNNs) and long short-term memory (LSTM) networks. It opens with an introduction of the speaker and an overview of the content. It then explains how RNNs process sequences one step at a time, passing a hidden state forward from each time step to the next, and discusses issues such as vanishing gradients that make it hard for plain RNNs to learn long-range dependencies. LSTMs are introduced as an RNN variant that can retain information over longer spans by using gates (forget, input, and output) to control what enters and leaves the cell state. Pre-trained word embeddings such as Word2Vec, GloVe, and FastText are briefly explained. Finally, homework is assigned: build a sentiment analysis model using an LSTM and pre-trained word embeddings on a Chinese text dataset.
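As a rough illustration of the homework task, below is a minimal sketch of an LSTM sentiment classifier in PyTorch. The class name, vocabulary size, and dimensions are assumptions for illustration, not values from the tutorial; in practice the embedding matrix would be loaded from real pre-trained vectors (e.g., Word2Vec or FastText vectors trained on Chinese text) rather than initialized randomly.

```python
import torch
import torch.nn as nn

class LSTMSentimentClassifier(nn.Module):
    """Minimal LSTM sentiment classifier on top of a pre-trained
    embedding matrix. All sizes here are illustrative assumptions."""

    def __init__(self, pretrained_embeddings, hidden_dim=128, num_classes=2):
        super().__init__()
        # Look up tokens in frozen pre-trained vectors
        # (e.g., Word2Vec / GloVe / FastText).
        self.embedding = nn.Embedding.from_pretrained(
            pretrained_embeddings, freeze=True
        )
        embed_dim = pretrained_embeddings.size(1)
        # batch_first=True -> input shape (batch, seq_len, embed_dim).
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq, embed_dim)
        _, (h_n, _) = self.lstm(embedded)      # h_n: (1, batch, hidden_dim)
        # Classify from the final hidden state of the sequence.
        return self.fc(h_n[-1])                # (batch, num_classes)

# Hypothetical usage: 10,000-token vocabulary with 300-d vectors.
vocab_size, embed_dim = 10_000, 300
embeddings = torch.randn(vocab_size, embed_dim)  # stand-in for real vectors
model = LSTMSentimentClassifier(embeddings)
batch = torch.randint(0, vocab_size, (4, 20))    # 4 sentences, 20 tokens each
logits = model(batch)                            # shape: (4, 2)
```

For a Chinese dataset, some tokenization step (for example word segmentation with a tool like jieba, or simple character-level splitting) would precede the lookup into the embedding table, since the pre-trained vectors must be keyed by the same token units.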