This document introduces random forests and stochastic gradient boosting for classification and regression. Random forests grow many decision trees on bootstrap samples of the data and average their predictions; sampling a random subset of features at each split keeps the trees weakly correlated, which stabilizes the averaged prediction. Stochastic gradient boosting iteratively adds small regression trees, each fit to the current gradient of a loss function on a random subsample of the data, which reduces the risk of overfitting. The document gives examples of both methods achieving strong predictive performance in competitions and industry applications.
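
The two ensemble methods described above can be sketched in a few lines. This is a minimal illustration, assuming scikit-learn is available; the dataset, hyperparameter values, and variable names are illustrative choices, not taken from the document.

```python
# Minimal sketch contrasting random forests and stochastic gradient
# boosting on a synthetic classification task (assumes scikit-learn).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Random forest: many trees grown on bootstrap samples; a random feature
# subset at each split (max_features="sqrt") keeps the trees decorrelated,
# and their predictions are averaged.
rf = RandomForestClassifier(n_estimators=200, max_features="sqrt",
                            random_state=0)
rf.fit(X_train, y_train)

# Stochastic gradient boosting: small trees (max_depth=3) added one at a
# time to reduce the loss; subsample=0.5 fits each tree on a random half
# of the data, which helps limit overfitting.
gb = GradientBoostingClassifier(n_estimators=200, max_depth=3,
                                learning_rate=0.1, subsample=0.5,
                                random_state=0)
gb.fit(X_train, y_train)

print(f"random forest accuracy:    {rf.score(X_test, y_test):.3f}")
print(f"gradient boosting accuracy: {gb.score(X_test, y_test):.3f}")
```

On a well-separated synthetic task like this, both ensembles typically reach high held-out accuracy; the interesting differences appear on noisier real-world data.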