This document provides an introduction to natural language processing (NLP). It discusses how IBM Watson answers questions by extracting relevant information with named entity recognition and relation extraction, then using information retrieval to find candidate answers. Next, it describes how text generation works, using Markov chains, hidden Markov models, conditional random fields, recurrent neural networks, and long short-term memory networks to predict the next character or word. Finally, it notes some limitations of current AI and provides extra resources for further reading.
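To make the Markov-chain idea concrete, here is a minimal word-level sketch (not from the original document): each word maps to the words observed to follow it in a training corpus, and generation repeatedly samples a successor. The corpus string and function names are illustrative.

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each state (tuple of `order` words) to the words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, length=8, seed=None):
    """Walk the chain, sampling each next word from observed successors."""
    rng = random.Random(seed)
    state = rng.choice(list(chain.keys()))
    out = list(state)
    while len(out) < length:
        successors = chain.get(state)
        if not successors:          # dead end: state never followed by anything
            break
        out.append(rng.choice(successors))
        state = tuple(out[-len(state):])
    return " ".join(out)

corpus = "the cat sat on the mat the cat ate the fish"
chain = build_chain(corpus)
print(generate(chain, length=8, seed=0))
```

Every adjacent word pair in the output occurs somewhere in the corpus, which is exactly the first-order Markov property; the higher-order models and neural networks mentioned above condition on longer histories to produce more coherent text.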