The document presents a comparative study of algorithms for nonlinear optimization. It evaluates several such algorithms (conjugate gradient methods, quasi-Newton methods, Powell's method) on standard test functions (Booth, Himmelblau, Beale) from a range of initial points and step sizes. For the Booth function, all algorithms converge in two steps, as expected: Booth is a convex quadratic in two variables, on which these methods terminate in finitely many steps under exact line searches. For Himmelblau, the Polak-Ribière and BFGS methods perform best. For Beale, the conjugate gradient methods converge from some initial points but not others, whereas Powell's method converges from a wider set of starting points.
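As an illustration, a comparison of this kind can be run in outline with SciPy's minimize routine, which offers a nonlinear conjugate gradient method (method 'CG', a Polak-Ribière variant), the BFGS quasi-Newton method, and Powell's derivative-free method. This is a minimal sketch, not the study's actual code: the test-function definitions are the standard ones, but the initial points are hypothetical placeholders, since the summary does not list the ones actually used.

```python
import numpy as np
from scipy.optimize import minimize

# Standard two-dimensional test functions.
def booth(x):
    return (x[0] + 2*x[1] - 7)**2 + (2*x[0] + x[1] - 5)**2

def himmelblau(x):
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

def beale(x):
    return ((1.5 - x[0] + x[0]*x[1])**2
            + (2.25 - x[0] + x[0]*x[1]**2)**2
            + (2.625 - x[0] + x[0]*x[1]**3)**2)

functions = {"Booth": booth, "Himmelblau": himmelblau, "Beale": beale}

# Hypothetical starting points, chosen only for illustration;
# the study's actual initial points are not given in the summary.
starts = [np.array([0.0, 0.0]), np.array([4.0, 4.0]), np.array([-2.0, 2.0])]

# 'CG' is SciPy's nonlinear conjugate gradient (a Polak-Ribière variant),
# 'BFGS' a quasi-Newton method, 'Powell' a direction-set method.
methods = ["CG", "BFGS", "Powell"]

for name, f in functions.items():
    for x0 in starts:
        for m in methods:
            res = minimize(f, x0, method=m)
            print(f"{name:10s} {m:6s} x0={x0}  "
                  f"converged={res.success}  iters={res.nit}  "
                  f"x*={np.round(res.x, 4)}")
```

Running a sweep like this makes the summary's qualitative pattern easy to check: on Booth every method reaches the minimizer (1, 3) almost immediately, while on Beale the outcome depends visibly on the starting point.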