Recurrent Neural Networks (RNNs) can be used to generate text that resembles their original training data. There are many articles out there showing the hilarious end results of such adventures, but start-from-scratch walkthroughs that show the raw code, like this one, are hard to come by. This presentation will demonstrate everything you need to join in: grab your own dataset, process it, train a model on it, and sample from it. Training a model on a CPU can take hours, but in this session you will learn how hardware-accelerated training on a GPU takes only seconds. Come away from this session with everything you need to “randomly” generate new bodies of text from your own datasets!
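
For a taste of the kind of code the session walks through, here is a minimal character-level RNN sketch in PyTorch. It is an illustrative assumption, not the session's exact code: the filename `corpus.txt`, the GRU architecture, and every hyperparameter are placeholder choices.

```python
# Minimal character-level RNN: load text, train, then sample.
# Assumes a plain-text file `corpus.txt` as the training set (hypothetical name).
import torch
import torch.nn as nn

text = open("corpus.txt").read()
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}  # char -> index
itos = {i: c for c, i in stoi.items()}      # index -> char
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

# Training runs on the GPU when one is available; this is where the
# hours-to-seconds speedup comes from.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = CharRNN(len(chars)).to(device)
opt = torch.optim.Adam(model.parameters(), lr=3e-3)
loss_fn = nn.CrossEntropyLoss()
seq_len, batch = 100, 32

for step in range(1000):  # step count is illustrative
    # Random slices of the corpus; the target is the input shifted by one char.
    ix = torch.randint(0, len(data) - seq_len - 1, (batch,))
    x = torch.stack([data[i:i + seq_len] for i in ix]).to(device)
    y = torch.stack([data[i + 1:i + seq_len + 1] for i in ix]).to(device)
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sampling: feed the model its own output one character at a time.
h = None
x = torch.tensor([[stoi[text[0]]]], device=device)
out = [text[0]]
for _ in range(500):
    logits, h = model(x, h)
    probs = torch.softmax(logits[0, -1], dim=-1)
    x = torch.multinomial(probs, 1).view(1, 1)  # "randomly" pick the next char
    out.append(itos[x.item()])
print("".join(out))
```

Swapping `corpus.txt` for any text you like (song lyrics, source code, meeting minutes) is all it takes to point the same script at a new dataset.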