This document summarizes two dimensionality reduction techniques: principal component analysis (PCA) and linear discriminant analysis (LDA). PCA reduces dimensionality while retaining as much of the variation in the data as possible; it finds the directions of maximum variance as the leading eigenvectors of the data's covariance matrix. LDA instead reduces dimensionality so as to best separate known classes, maximizing between-class scatter while minimizing within-class scatter; it finds these discriminative directions by solving a generalized eigenvalue problem involving the between-class and within-class scatter matrices. Both techniques are useful in applications such as face recognition, where high-dimensional images are projected onto a low-dimensional subspace before classification.
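The two procedures described above can be sketched directly with NumPy. This is a minimal illustration, not a production implementation: the function names (`pca`, `lda`) and the synthetic two-class data are assumptions for the example, PCA takes the top-k eigenvectors of the covariance matrix, and LDA solves the generalized eigenproblem Sb w = λ Sw w by forming Sw⁻¹ Sb (which assumes Sw is invertible — real pipelines often add regularization or a PCA step first).

```python
import numpy as np

def pca(X, k):
    """Project X onto the k directions of maximum variance."""
    Xc = X - X.mean(axis=0)                    # center the data
    cov = np.cov(Xc, rowvar=False)             # covariance matrix
    vals, vecs = np.linalg.eigh(cov)           # eigh: covariance is symmetric
    order = np.argsort(vals)[::-1]             # sort by descending variance
    W = vecs[:, order[:k]]
    return Xc @ W, W

def lda(X, y, k):
    """Project X onto k directions maximizing between- vs. within-class scatter."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))                      # within-class scatter
    Sb = np.zeros((d, d))                      # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Generalized eigenproblem Sb w = lambda Sw w, solved via Sw^{-1} Sb.
    # Assumes Sw is nonsingular; eigenvectors may be complex, so keep real parts.
    vals, vecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(vals.real)[::-1]
    W = vecs[:, order[:k]].real
    return X @ W, W

# Toy two-class data (an assumption for the demo, standing in for e.g. face images)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
y = np.array([0] * 50 + [1] * 50)

Z_pca, _ = pca(X, 2)    # 4-D data down to 2 variance-preserving dimensions
Z_lda, _ = lda(X, y, 1) # at most (num_classes - 1) = 1 discriminative dimension
print(Z_pca.shape, Z_lda.shape)
```

Note that LDA yields at most C − 1 useful directions for C classes (Sb has rank at most C − 1), whereas PCA can keep up to the full data dimensionality.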