This document explores optimal control theory through the lens of variational principles in mechanics, highlighting the shared philosophical origins of, and the connections between, the two fields. It traces the evolution of optimal control from historical problems such as the brachistochrone to modern applications and numerical methods, with emphasis on the Hamilton-Jacobi-Bellman equation and its subsequent developments. The report aims to make the principles of optimal control accessible to readers already familiar with analytical mechanics, presenting both foundational and advanced concepts.