This document discusses vector space word representations, which map words to dense numerical vectors based on their contexts. Words that appear in similar contexts are mapped to vectors with small angles between them. Historical methods included hard and soft clustering, while modern techniques use neural networks like Word2Vec. These representations allow words with similar meanings to have similar vectors, and can be used to solve word analogy problems by performing vector arithmetic. While not perfect, vector representations have improved many NLP tasks such as named entity recognition and sentiment analysis.
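The analogy-solving idea above can be sketched with a toy example. The vectors below are hand-made, purely illustrative stand-ins for the dense embeddings a model like Word2Vec would learn (real embeddings have hundreds of dimensions trained from corpus statistics); the standard trick shown is computing b - a + c and picking the nearest remaining word by cosine similarity.

```python
import numpy as np

# Toy 3-dimensional embeddings, purely illustrative; real models learn
# high-dimensional vectors from word co-occurrence in large corpora.
vectors = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "queen": np.array([0.8, 0.1, 0.9]),
    "man":   np.array([0.2, 0.9, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9]),
}

def cosine(a, b):
    # Cosine similarity: 1.0 means a zero-degree angle between vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via vector arithmetic."""
    target = vectors[b] - vectors[a] + vectors[c]
    # Exclude the three query words, as is conventional in analogy evaluation.
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman"))  # → queen
```

With these toy vectors, king - man + woman lands exactly on the queen vector, so the nearest-neighbor lookup recovers "queen"; with learned embeddings the match is approximate rather than exact, which is one reason the summary hedges that the method is "not perfect".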