This document discusses decision tree induction algorithms and their splitting criteria, covering ID3, CART, and C4.5. ID3 selects splits by information gain, computed from entropy. CART uses the Gini index, which measures impurity at each node: 0 for a pure node, with a maximum of 0.5 for a two-class problem (and 1 − 1/k for k classes). C4.5 improves on ID3 by using the gain ratio, which normalizes information gain by the split information to reduce the bias toward attributes with many values. The document works through examples of computing the Gini index and classification error for different class distributions at a node.
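As a minimal sketch of the criteria named above, the following Python functions compute entropy, Gini impurity, classification error, information gain, and gain ratio from class counts at a node. The function names and the example count distributions are illustrative, not taken from the source document.

```python
import math

def entropy(counts):
    """Shannon entropy (bits) of a class distribution; used by ID3 and C4.5."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def gini(counts):
    """Gini impurity used by CART: 0 for a pure node, 0.5 at worst for two classes."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts)

def classification_error(counts):
    """Misclassification error: 1 minus the majority-class fraction."""
    return 1.0 - max(counts) / sum(counts)

def information_gain(parent_counts, child_counts_list):
    """ID3's criterion: parent entropy minus the weighted entropy of the children."""
    total = sum(parent_counts)
    weighted = sum(sum(c) / total * entropy(c) for c in child_counts_list)
    return entropy(parent_counts) - weighted

def gain_ratio(parent_counts, child_counts_list):
    """C4.5's criterion: information gain normalized by the split information,
    i.e. the entropy of the child-node sizes."""
    split_info = entropy([sum(c) for c in child_counts_list])
    return information_gain(parent_counts, child_counts_list) / split_info

# Example class distributions at a node (counts per class):
print(gini([10, 0]))                 # pure node: impurity 0
print(gini([5, 5]))                  # maximally impure two-class node: 0.5
print(classification_error([3, 7]))  # majority class covers 70% of records
print(gain_ratio([5, 5], [[5, 0], [0, 5]]))  # a perfect binary split
```

A perfect two-way split of a balanced binary node yields information gain 1 bit and split information 1 bit, so its gain ratio is 1; an attribute that fragments the data into many small children is penalized because its split information grows.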