Running time analysis evaluates the efficiency of an algorithm by examining how the number of steps required to solve a problem grows with the input size. This order of growth, or time complexity, is usually expressed in Big O notation and predicts how well the algorithm will scale to larger problem sizes. The most efficient algorithms are those whose step counts grow most slowly as the input size increases.
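As a minimal sketch of this idea, the snippet below (hypothetical helper names) compares the worst-case comparison counts of linear search, which grows as O(n), with binary search, which grows as O(log n), showing how the gap widens as the input size increases:

```python
import math

def linear_search_steps(n):
    # Worst case for linear search: every one of the n elements
    # is compared once -> O(n) comparisons.
    return n

def binary_search_steps(n):
    # Worst case for binary search on a sorted list: the search
    # range is halved each step -> O(log n) comparisons.
    return max(1, math.ceil(math.log2(n + 1)))

# As n grows by factors of 1000, linear search's cost grows by the
# same factor, while binary search's cost barely increases.
for n in (10, 10_000, 10_000_000):
    print(f"n={n:>10}  linear={linear_search_steps(n):>10}  "
          f"binary={binary_search_steps(n):>3}")
```

Running it shows that at ten million elements a linear scan may need ten million comparisons while binary search needs only about two dozen, which is why the slower-growing algorithm scales better.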