The document discusses optimization methods for sparse coding and dictionary learning. It covers greedy pursuit methods, such as matching pursuit and orthogonal matching pursuit, which approximate the signal by selecting one dictionary atom at a time, and homotopy methods such as LARS, which follow the piecewise-linear regularization path of the lasso. The document derives optimality conditions for the lasso using directional derivatives and describes how coordinate descent can optimize separable nonsmooth objectives such as the lasso. Code examples using the SPAMS toolbox are provided.
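To make the greedy-pursuit idea concrete, the following is a minimal NumPy sketch of orthogonal matching pursuit, not the document's own code. It assumes a dictionary D with unit-norm columns and a fixed target sparsity k; the SPAMS toolbox mentioned above provides optimized routines for the same problem.

```python
import numpy as np

def omp(x, D, k):
    """Orthogonal matching pursuit: greedily select k atoms of D to approximate x.

    A minimal sketch assuming D has unit-norm columns. At each step the atom
    most correlated with the current residual joins the active set, and the
    coefficients are refit by least squares on that set.
    """
    residual = x.copy()
    support = []
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        correlations = D.T @ residual
        j = int(np.argmax(np.abs(correlations)))   # atom most correlated with residual
        if j not in support:
            support.append(j)
        # Refit coefficients by projecting x onto the span of the selected atoms
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ sol
    coeffs[support] = sol
    return coeffs
```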
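The coordinate-descent approach for the lasso can likewise be summarized in a short sketch. The code below is an illustrative NumPy implementation, not the document's: it assumes the objective 0.5*||x - D @ alpha||^2 + lam*||alpha||_1 with unit-norm dictionary columns, so each one-dimensional subproblem has a closed-form soft-thresholding solution; the names x, D, and lam are assumptions made here for illustration.

```python
import numpy as np

def lasso_coordinate_descent(x, D, lam, n_iter=100):
    """Cyclic coordinate descent for the lasso (sketch, unit-norm columns assumed).

    Each pass updates one coefficient at a time via soft-thresholding and keeps
    the residual x - D @ alpha up to date incrementally.
    """
    n_atoms = D.shape[1]
    alpha = np.zeros(n_atoms)
    residual = x - D @ alpha
    for _ in range(n_iter):
        for j in range(n_atoms):
            d_j = D[:, j]
            # Correlation of atom j with the partial residual (atom j's contribution removed)
            rho = d_j @ residual + alpha[j]
            new_alpha_j = np.sign(rho) * max(abs(rho) - lam, 0.0)  # soft-thresholding
            residual += d_j * (alpha[j] - new_alpha_j)             # incremental residual update
            alpha[j] = new_alpha_j
    return alpha
```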