Word embeddings are widely used in NLP tasks, but embedding phrases while preserving their semantic meaning remains challenging. The authors present a novel method that uses Siamese neural networks to embed words and multi-word units in the same vector space. The model learns to generate phrase representations based on their semantic similarity to single words: it is trained to predict the similarity between word–phrase pairs, and it outperforms previous models on phrase similarity and composition tasks.
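
To make the Siamese setup concrete, the sketch below shows one plausible instantiation in PyTorch, not the authors' exact architecture: a single shared encoder maps both a word and a multi-word phrase into the same vector space, and training pushes the cosine similarity of the two embeddings toward a gold similarity score. The encoder design (averaging composition plus a projection), the loss, and all names here are illustrative assumptions.

```python
# Minimal sketch of a Siamese word/phrase embedding model (illustrative;
# the composition function and loss are assumptions, not the paper's).
import torch
import torch.nn as nn

class SiameseEncoder(nn.Module):
    """Shared encoder: averages token embeddings, then projects.

    Both branches of the Siamese network reuse this one module, so a
    single word and a multi-word phrase land in the same vector space.
    """
    def __init__(self, vocab_size: int, dim: int = 300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len); a single word is just seq_len == 1.
        vecs = self.embed(token_ids).mean(dim=1)  # simple averaging composition
        return nn.functional.normalize(self.proj(vecs), dim=-1)

# One training step: fit predicted cosine similarity to the gold score.
encoder = SiameseEncoder(vocab_size=50_000)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

word_ids = torch.randint(0, 50_000, (32, 1))    # single words
phrase_ids = torch.randint(0, 50_000, (32, 4))  # four-token phrases
gold_sim = torch.rand(32)                       # similarity labels in [0, 1]

# Dot product of L2-normalized embeddings equals cosine similarity.
pred_sim = (encoder(word_ids) * encoder(phrase_ids)).sum(dim=-1)
loss = loss_fn(pred_sim, gold_sim)
loss.backward()
optimizer.step()
```

Because the weights are shared across both branches, nothing in the model distinguishes "word" inputs from "phrase" inputs; the supervision signal alone teaches the encoder to place a phrase near the single words it is semantically similar to.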