This document surveys methods for building text representations from word embeddings: bag-of-words, continuous bag-of-words, weighted continuous bag-of-words, and convolutional neural networks. Bag-of-words represents words as one-hot encodings, so all words are equally distant from one another and no similarity between them is captured. Continuous bag-of-words learns dense, distributed word representations but weights every word equally when combining them. Weighted continuous bag-of-words addresses this by additionally learning a per-word importance weight. Convolutional neural networks apply convolutions over windows of word vectors to learn local contextual representations. These methods are evaluated on a text classification task, where convolutional neural networks achieve the best performance.
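The four representation schemes above can be sketched in toy NumPy code. This is only an illustration of the composition step in each method: the vocabulary, embedding matrix `E`, importance weights `a`, and convolution filters `W` are made-up stand-ins for quantities that the document says are learned from data.

```python
import numpy as np

# Hypothetical toy vocabulary and sentence (illustration only).
vocab = {"the": 0, "cat": 1, "sat": 2, "mat": 3}
sentence = ["the", "cat", "sat"]
V, d = len(vocab), 5  # vocabulary size, embedding dimension

# Bag-of-words: sum of one-hot vectors; no notion of word similarity.
bow = np.zeros(V)
for w in sentence:
    bow[vocab[w]] += 1

rng = np.random.default_rng(0)
E = rng.normal(size=(V, d))  # embedding matrix (would be learned)

# Continuous bag-of-words: average of embeddings, every word equal.
cbow = np.mean([E[vocab[w]] for w in sentence], axis=0)

# Weighted continuous bag-of-words: a learned per-word importance
# weight rescales each embedding before averaging.
a = rng.random(V)  # importance weight per vocabulary word (would be learned)
idx = [vocab[w] for w in sentence]
weights = a[idx]
wcbow = (weights[:, None] * E[idx]).sum(axis=0) / weights.sum()

# 1-D convolution over the word-vector sequence: each filter reads a
# sliding window of k consecutive word vectors and emits one feature
# per position, capturing local context.
k, n_filters = 2, 3
W = rng.normal(size=(n_filters, k * d))  # filters (would be learned)
X = np.stack([E[vocab[w]] for w in sentence])  # (sentence length, d)
windows = np.stack([X[i:i + k].ravel() for i in range(len(sentence) - k + 1)])
conv = np.tanh(windows @ W.T)  # (positions, n_filters)

print(bow.shape, cbow.shape, wcbow.shape, conv.shape)
```

Note how the three bag-of-words variants all collapse the sentence to a single vector, while the convolution keeps one feature vector per window position; a pooling step over positions would typically follow before classification.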