This document summarizes decision trees:

- Decision trees are tree-like graphs used for decision support modeling. Classification trees predict categorical variables; regression trees predict continuous variables.
- Nodes in a decision tree include the root node, internal decision nodes, and leaf nodes. The depth of a tree is the length of the longest path from the root to a leaf.
- Common algorithms for building decision trees include ID3, C4.5, C5.0, and CART. They use splitting criteria such as information gain, the Gini index, and classification error to choose the best attribute to split on at each node.
- Without pruning, decision trees are prone to overfitting. Pre-pruning (stopping growth early) and post-pruning (removing branches from a fully grown tree) are the standard strategies to prevent it.
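As a minimal sketch of the splitting criteria mentioned above, the snippet below computes the Gini index and entropy-based information gain for a set of class labels. The function names and the example label counts are illustrative, not taken from any particular library:

```python
from collections import Counter
import math

def gini(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def entropy(labels):
    """Shannon entropy of the class distribution, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(parent, splits):
    """Entropy of the parent node minus the weighted entropy of its children."""
    n = len(parent)
    return entropy(parent) - sum(len(s) / n * entropy(s) for s in splits)

labels = ["yes"] * 9 + ["no"] * 5          # a toy node with 9 positives, 5 negatives
left, right = labels[:7], labels[7:]       # one candidate binary split
print(round(gini(labels), 3))              # 0.459
print(round(information_gain(labels, [left, right]), 3))  # 0.509
```

A splitter evaluates every candidate split this way and keeps the one with the highest gain (or lowest weighted impurity); ID3 and C4.5 use entropy-based gain, while CART uses the Gini index.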