This document provides an overview of natural language processing (NLP) techniques for beginners. It first discusses early methods such as one-hot encoding and bag-of-words representations, which ignore word order. It then describes the sparsity and curse-of-dimensionality problems these representations suffer from, and how word embeddings such as Word2Vec addressed them by assigning each word a dense vector learned from the contexts in which it appears. Later techniques discussed include GloVe, fastText, contextual embedding models such as ELMo and BERT, and recurrent architectures with gated memory such as LSTMs. The document concludes by noting that NLP has made exciting progress but remains a challenging field.
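As a minimal sketch of the bag-of-words idea mentioned above: each sentence is reduced to a vector of word counts over a fixed vocabulary, so word order is discarded entirely. The vocabulary and sentence here are invented for illustration.

```python
from collections import Counter

def bag_of_words(sentence, vocabulary):
    # Count how often each vocabulary word occurs; word order is lost.
    counts = Counter(sentence.lower().split())
    return [counts[word] for word in vocabulary]

# Hypothetical toy vocabulary; words outside it (e.g. "on") are simply dropped.
vocab = ["the", "cat", "sat", "mat", "dog"]
print(bag_of_words("The cat sat on the mat", vocab))  # → [2, 1, 1, 1, 0]
```

Note that "The cat sat on the mat" and "The mat sat on the cat" produce the identical vector, which is exactly the order-blindness the document says motivated later techniques such as word embeddings.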