Ensemble methods are machine learning techniques that combine multiple learning algorithms to obtain better predictive performance than any of the constituent algorithms alone. Bagging (bootstrap aggregating) draws multiple bootstrap samples of the training data, trains a model on each sample, and averages the predictions; because the averaged models are trained on different resamples, this primarily reduces variance. Boosting builds a strong learner from weak ones by training models sequentially, reweighting the training instances at each round so that later models concentrate on the instances earlier models predicted incorrectly; this primarily reduces bias, though it can also reduce variance.
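The bagging procedure described above can be sketched in a few lines. This is a minimal, illustrative pure-Python implementation using decision stumps as the base learner on one-dimensional data; the function names (`fit_stump`, `bagged_predict`) and the choice of stump base learner are assumptions for the sketch, not a reference implementation.

```python
import random
from statistics import mean

def fit_stump(xs, ys):
    """Fit a one-split decision stump (threshold regressor) by minimizing squared error."""
    best = None
    for t in xs:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue  # split must leave points on both sides
        lm, rm = mean(left), mean(right)
        err = sum((y - lm) ** 2 for y in left) + sum((y - rm) ** 2 for y in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def bagged_predict(xs, ys, n_models=25, seed=0):
    """Train stumps on bootstrap resamples and return a predictor that averages them."""
    rng = random.Random(seed)
    n = len(xs)
    models = []
    for _ in range(n_models):
        # Bootstrap sample: draw n indices with replacement.
        idx = [rng.randrange(n) for _ in range(n)]
        models.append(fit_stump([xs[i] for i in idx], [ys[i] for i in idx]))
    # Averaging the individual predictions is what reduces variance.
    return lambda x: mean(m(x) for m in models)
```

For example, fitting `bagged_predict(list(range(10)), [0.0]*5 + [1.0]*5)` yields a predictor that returns values near 0 for small inputs and near 1 for large ones; each individual stump splits at a slightly different threshold depending on its bootstrap sample, and the average smooths over those differences.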