This document discusses classical optimization theory and methods for finding extrema of functions. It defines local and global minima/maxima and presents necessary and sufficient conditions for identifying extrema. Specifically, a necessary condition for a local extremum at an interior point is that the gradient vanish there; sufficient conditions additionally require the Hessian matrix to be positive definite (for a minimum) or negative definite (for a maximum). It then introduces the Newton-Raphson method for numerically solving the stationarity condition of setting the gradient to zero. An example demonstrates applying Newton-Raphson to find an extremum of a sample function.
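
The procedure described above can be sketched as follows. This is a minimal illustration, not the document's own example: the sample function here, f(x, y) = (x - 1)² + 2(y + 2)², is hypothetical, chosen so the minimizer is known in closed form. Newton-Raphson is applied to the gradient, with each iterate solving a linear system in the Hessian; the positive-definite Hessian then certifies the stationary point as a minimum.

```python
import numpy as np

def newton_raphson(grad, hess, x0, tol=1e-10, max_iter=50):
    """Find a stationary point by solving grad(x) = 0 with Newton-Raphson.

    Each iteration solves hess(x) * step = grad(x) and updates x <- x - step.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
        if np.linalg.norm(step) < tol:  # stop once the update is negligible
            break
    return x

# Hypothetical sample function: f(x, y) = (x - 1)^2 + 2*(y + 2)^2
grad = lambda v: np.array([2.0 * (v[0] - 1.0), 4.0 * (v[1] + 2.0)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 4.0]])

x_star = newton_raphson(grad, hess, x0=[5.0, 5.0])
# The Hessian is positive definite everywhere, so the stationary point
# x_star = (1, -2) is a local (indeed global) minimum.
```

Because this sample function is quadratic, Newton-Raphson converges in a single step; for general smooth functions the iteration converges quadratically near a nondegenerate stationary point.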