The document discusses several papers on neural machine translation (NMT), summarizing key points from seminars on multilingual NMT models: 1) A multilingual NMT system prepends an artificial token to the input to specify the target language, enabling translation between multiple language pairs with a single model. 2) Multilingual models achieve comparable or state-of-the-art performance on several language pairs, and they enable zero-shot translation between pairs that have no direct parallel training data. 3) Additional experiments explore mixing various language pairs during training and measure the effects on translation quality, particularly for low-resource languages.
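The artificial-token mechanism in point 1 can be sketched as a simple preprocessing step. This is a minimal illustration, not the described system's actual code; the `<2xx>` token format and the function name are assumptions chosen for clarity.

```python
def add_target_token(source_sentence: str, target_lang: str) -> str:
    """Prepend an artificial token telling the model which language to produce.

    The token format "<2xx>" is an assumption for illustration; the real
    system treats this token as an ordinary vocabulary item, so no other
    architectural change is needed to make one model serve many directions.
    """
    return f"<2{target_lang}> {source_sentence}"

# The same English input, routed to different target languages by the token alone:
print(add_target_token("Hello, how are you?", "es"))  # → <2es> Hello, how are you?
print(add_target_token("Hello, how are you?", "ja"))  # → <2ja> Hello, how are you?
```

Because the target language is encoded in the input rather than in the model, a request for an untrained pair (say, Portuguese to Spanish when only Portuguese–English and English–Spanish data were seen) is still a well-formed input, which is what makes zero-shot translation possible.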