This document develops analytical models of learning curves with variable processing times. A learning curve captures the effect that operators become faster at a task as they repeat it, and the basic log-linear model is presented as its standard mathematical representation. The objective is to compute processing times for single-machine and two-machine systems under models that account for processing-time variability, such as the exponential and hypo-exponential distributions. Equations are derived, and examples illustrate how to determine both the average and the variability of processing times. Further work is identified to extend the calculations to two-machine systems with both infinite and finite buffer sizes.
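As a minimal sketch of the log-linear model mentioned above: in its common parameterization, the time for the n-th repetition is T_n = T_1 · n^(−b), where b is derived from the learning rate (the fraction of time remaining each time cumulative output doubles). The function name and the learning-rate parameterization here are illustrative assumptions, not taken from the document.

```python
import math

def loglinear_time(t1: float, n: int, learning_rate: float) -> float:
    """Processing time of the n-th repetition under the log-linear model.

    t1            -- time for the first unit
    n             -- repetition number (n >= 1)
    learning_rate -- fraction of time remaining each time cumulative
                     output doubles, e.g. 0.8 for an "80% curve"
                     (assumed parameterization)
    """
    b = -math.log2(learning_rate)  # learning exponent
    return t1 * n ** (-b)
```

For example, with an 80% curve and a 10-minute first unit, the 2nd unit takes 8 minutes and the 4th takes 6.4 minutes: each doubling of cumulative output multiplies the time by the learning rate.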