This document provides an overview of preferred natural language processing infrastructure and techniques, covering recurrent neural networks, statistical machine translation tools such as GIZA++ and Moses, speech recognition systems from NICT and NTT, topic modeling with latent Dirichlet allocation, dependency parsing based on minimum spanning trees, and recursive neural networks for natural language tasks. References to key papers on these methods are included.
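As a brief illustration of one of the listed techniques, the sketch below shows topic modeling with latent Dirichlet allocation. It is a minimal example using scikit-learn's CountVectorizer and LatentDirichletAllocation (not a tool named in this document); the toy documents and the choice of two topics are assumptions for demonstration only.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy corpus: two loose themes (machine translation vs. speech recognition).
docs = [
    "statistical machine translation with phrase tables",
    "word alignment models for machine translation",
    "acoustic models for speech recognition",
    "speech recognition with neural network acoustic features",
]

# Build a bag-of-words matrix, then fit an LDA model with 2 topics.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)  # per-document topic proportions

# Print the top words of each inferred topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = weights.argsort()[::-1][:5]
    print(f"Topic {k}:", ", ".join(terms[i] for i in top))
```

With a real corpus, the number of topics and preprocessing (tokenization, stop-word handling) would need to be tuned; the snippet only demonstrates the basic workflow of vectorizing text and inspecting the learned topic-word distributions.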