- word2vec is a neural network model that learns word embeddings from large amounts of text by predicting words from their context. It comes in two architectures: Continuous Bag of Words (CBOW) and Skip-Gram.
- CBOW predicts a target word from its surrounding context words, while Skip-Gram predicts the surrounding context words given the target word.
- Both models are trained with backpropagation and stochastic gradient descent to maximize the log-likelihood of correct word-context pairs in the corpus.
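
The CBOW/Skip-Gram distinction above can be sketched by showing how each architecture slices the same text into training examples. This is a minimal illustrative sketch, not the actual word2vec implementation; the window size, helper names, and toy corpus are assumptions for demonstration.

```python
def skipgram_pairs(tokens, window=2):
    """Skip-Gram: each target word predicts each of its context words,
    so one position yields several (target, context) pairs."""
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

def cbow_examples(tokens, window=2):
    """CBOW: all context words jointly predict the target word,
    so one position yields a single (context_list, target) example."""
    examples = []
    for i, target in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window), min(len(tokens), i + window + 1))
                   if j != i]
        examples.append((context, target))
    return examples

corpus = "the quick brown fox jumps".split()  # toy corpus (assumption)
print(skipgram_pairs(corpus, window=1))
print(cbow_examples(corpus, window=1))
```

In a full model, each pair or example would be fed through a shallow network (embedding lookup plus output layer) and the embedding weights updated by SGD.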