This document studies optimal control problems for stochastic sequential machines (SSMs). It first introduces SSMs and defines their components, then formulates the optimal control problem for processes represented by SSMs and proves the principle of optimality. Using dynamic programming, it derives the Bellman equation whose solution gives the optimal control. In conclusion, it shows that the principle of optimality and the Bellman equation together yield the optimal control for processes modeled as SSMs.
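The dynamic-programming approach described above can be illustrated with a minimal sketch. The machine below is hypothetical (two states, two control inputs, a three-step horizon, and made-up transition probabilities and costs), since the document does not give a concrete SSM; the backward recursion, however, is the standard Bellman recursion V_k(s) = min_u [c(s,u) + Σ_{s'} p(s'|s,u) V_{k+1}(s')] that the dynamic-programming method produces.

```python
# Hypothetical finite SSM: 2 states, 2 inputs, horizon N = 3.
# All transition probabilities and costs are illustrative, not from the paper.
P = {  # P[u][s][s2] = probability of moving s -> s2 under input u
    0: [[0.9, 0.1], [0.4, 0.6]],
    1: [[0.2, 0.8], [0.7, 0.3]],
}
cost = {0: [1.0, 2.0], 1: [1.5, 0.5]}  # cost[u][s] = stage cost in state s under input u
terminal = [0.0, 0.0]                  # terminal cost V_N(s)
N = 3

V = terminal[:]          # cost-to-go at the final stage
policy = []              # policy[k][s] = optimal input at stage k, state s
for k in range(N - 1, -1, -1):   # Bellman recursion, backward in time
    Vk, pik = [], []
    for s in range(2):
        # Q-value of each input: stage cost plus expected cost-to-go
        q = {u: cost[u][s] + sum(P[u][s][s2] * V[s2] for s2 in range(2))
             for u in P}
        u_best = min(q, key=q.get)
        Vk.append(q[u_best])
        pik.append(u_best)
    V = Vk
    policy.insert(0, pik)

print(V)       # optimal expected cost from each initial state
print(policy)  # optimal feedback control law, stage by stage
```

The recursion runs once per stage, so its cost is linear in the horizon and quadratic in the number of states per input, which is what makes the dynamic-programming formulation tractable for finite machines.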