This document discusses several basic optimization algorithms: 1) Steepest descent moves along the negative gradient, the direction of steepest local decrease of the function, at each step. 2) Newton's method approximates the function by a quadratic model built from the gradient and Hessian, and steps to the stationary point of that model by setting its gradient to zero. 3) Conjugate gradient methods choose each new search direction to be conjugate (with respect to the Hessian) to the previous directions, which improves convergence over steepest descent; on an n-dimensional quadratic this yields the exact minimizer in at most n steps, and nonlinear variants extend the idea to non-quadratic functions.