Random Forest, Boosted Trees, and other ensemble learning methods build multiple models to improve predictive performance over any single model. They combine "weak learners" such as decision trees into a "strong learner". Random Forest extends bagging by also selecting a random subset of features at each split, which decorrelates the trees and so reduces variance further than bagging alone. Boosting instead trains trees sequentially, with each new tree fit to the errors of the trees before it (for example, by reweighting misclassified examples, as in AdaBoost), which primarily reduces bias rather than variance. In practice, Random Forest is usually faster to train, since its trees can be built in parallel, and it is more robust to hyperparameter choices, while a carefully tuned Boosting model can match or exceed its accuracy. Neural networks can loosely be viewed in ensemble terms as well, since they combine many simple units into a stronger model, although those units are trained jointly rather than independently.
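A minimal sketch of the two styles using scikit-learn illustrates the contrast; the synthetic dataset and hyperparameter values here are illustrative assumptions, not tuned settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic data; any tabular classification task would do.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Random Forest: trees grown independently on bootstrap samples,
# considering a random subset of features at each split.
rf = RandomForestClassifier(n_estimators=200, max_features="sqrt", random_state=0)
rf.fit(X_train, y_train)

# Gradient Boosting: shallow trees fit sequentially, each one
# correcting the errors of the ensemble built so far.
gb = GradientBoostingClassifier(n_estimators=200, max_depth=3, random_state=0)
gb.fit(X_train, y_train)

print("Random Forest accuracy:   ", rf.score(X_test, y_test))
print("Gradient Boosting accuracy:", gb.score(X_test, y_test))
```

Note that the Random Forest trees are independent and could be trained in parallel, whereas the boosted trees must be built one after another, which is the structural reason for the training-speed difference mentioned above.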