Pretraining and Language Modeling
Pre-training: An unsupervised learning phase that precedes traditional supervised learning
• Original goal: provide better initialization points for supervised training
Language modeling: Predict a part of a given language piece (target) using the rest (context)
• A classic NLP task for modeling how humans use natural language
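The idea of predicting a target from its context can be illustrated with a toy count-based bigram model; this is a minimal sketch (the corpus and function names are hypothetical), not how modern neural language models are built:

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus, pre-tokenized into words.
corpus = "the cat sat on the mat . the cat ate the fish .".split()

# Count how often each target word follows each one-word context.
bigram_counts = defaultdict(Counter)
for context, target in zip(corpus, corpus[1:]):
    bigram_counts[context][target] += 1

def predict_next(context):
    """Return the most frequent word observed after `context`, or None if unseen."""
    counts = bigram_counts.get(context)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice; "mat" and "fish" once each
```

Neural language models replace these raw counts with learned probability distributions over much longer contexts, but the prediction task itself is the same.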