This document discusses using recurrent neural networks (RNNs) to generate text, specifically novels. It gives an overview of how RNNs work and how they can be trained on small text datasets to generate new text in the style of the source material. The results showed that after millions of training iterations, the RNN-generated text began to resemble the source in vocabulary, grammar, and flow, without reproducing it verbatim. Other examples discussed include RNNs trained on Wikipedia articles, the Bible, and Obama's speeches.
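To make the training-and-sampling loop concrete, here is a minimal sketch of the general technique: an RNN is fit to predict the next character of a corpus, then sampled one character at a time, feeding each prediction back in. The source does not specify a framework, model details, or hyperparameters, so everything below (PyTorch, a GRU, the toy corpus, the step count) is an illustrative assumption, not the document's actual setup.

```python
# Hypothetical character-level RNN sketch (PyTorch and all names/settings
# here are assumptions for illustration; the source does not specify them).
import torch
import torch.nn as nn

text = "the quick brown fox jumps over the lazy dog. "  # stand-in corpus
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Each character's target is simply the character that follows it.
data = torch.tensor([stoi[c] for c in text])
x, y = data[:-1].unsqueeze(0), data[1:].unsqueeze(0)

# The post reports millions of iterations on real corpora; 500 is a toy run.
for step in range(500):
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Generate new text: sample a character, feed it back, repeat.
idx = torch.tensor([[stoi["t"]]])
h, out = None, "t"
with torch.no_grad():
    for _ in range(100):
        logits, h = model(idx, h)
        probs = torch.softmax(logits[0, -1], dim=-1)
        idx = torch.multinomial(probs, 1).unsqueeze(0)
        out += itos[idx.item()]
print(out)
```

On a corpus this small the samples are near-verbatim memorization; the behavior the document describes (source-like vocabulary and grammar without exact copying) emerges only with far larger corpora and much longer training.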