Random forest is an ensemble classifier built from many decision trees. It combines bagging (bootstrap aggregation) with random feature selection: each tree is grown on a bootstrap sample of the training data, and at each node only a random subset of the features is considered when choosing the split. A new sample is classified by running it through every tree and taking the majority vote, i.e., the mode of the individual tree predictions. Random forests classify accurately, scale to thousands of input variables, estimate variable importance, and yield an unbiased estimate of the generalization error (the out-of-bag error) as the trees are grown.
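The mechanics described above — bootstrap sampling, per-node random feature selection, and majority voting — can be sketched in a toy implementation. This is a deliberately minimal illustration, not a production random forest: each "tree" is a one-level decision stump, the dataset is synthetic, and all function names are invented for this example.

```python
import random
from collections import Counter

def bootstrap(data):
    # Bootstrap sample: n draws from the data, with replacement.
    return [random.choice(data) for _ in data]

def best_stump(sample, feature_subset):
    # One-level tree: among the randomly chosen features, pick the
    # (feature, threshold) split minimizing misclassifications.
    best, best_err = None, float("inf")
    for f in feature_subset:
        for x, _ in sample:
            t = x[f]
            left = [y for xx, y in sample if xx[f] <= t]
            right = [y for xx, y in sample if xx[f] > t]
            if not left or not right:
                continue
            l_maj = Counter(left).most_common(1)[0][0]
            r_maj = Counter(right).most_common(1)[0][0]
            err = sum(y != l_maj for y in left) + sum(y != r_maj for y in right)
            if err < best_err:
                best_err, best = err, (f, t, l_maj, r_maj)
    return best

def train_forest(data, n_trees, n_features, m_try):
    # Each tree sees its own bootstrap sample and a random feature subset.
    forest = []
    for _ in range(n_trees):
        sample = bootstrap(data)
        feats = random.sample(range(n_features), m_try)
        stump = best_stump(sample, feats)
        if stump is not None:
            forest.append(stump)
    return forest

def predict(forest, x):
    # Majority vote: the mode of the individual tree predictions.
    votes = [l if x[f] <= t else r for f, t, l, r in forest]
    return Counter(votes).most_common(1)[0][0]

# Synthetic two-class data: class 0 clustered near (0, 0), class 1 near (5, 5).
random.seed(0)
data = ([((random.gauss(0, 1), random.gauss(0, 1)), 0) for _ in range(30)]
        + [((random.gauss(5, 1), random.gauss(5, 1)), 1) for _ in range(30)])

forest = train_forest(data, n_trees=25, n_features=2, m_try=1)
print(predict(forest, (0.0, 0.0)), predict(forest, (5.0, 5.0)))
```

Real implementations grow full (unpruned) trees, re-draw the feature subset at every node rather than once per tree, and track out-of-bag samples for error estimation; in practice one would reach for an established library such as scikit-learn's RandomForestClassifier rather than this sketch.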