Time complexity analysis specifies how the running time of an algorithm grows with the size of its input. It is carried out by identifying the algorithm's basic operation and counting how many times that operation executes as the input size increases. The result can be expressed as a function T(n) relating the input size n to the running time. Common time complexities, in increasing order of growth, include constant O(1), logarithmic O(log n), linear O(n), and quadratic O(n^2). The order of growth describes how fast the running time increases as n grows and is the standard basis for comparing the efficiency of algorithms.
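
As a minimal sketch of how such counting works (written in Python; the function names and the choice of a comparison as the basic operation are illustrative assumptions, not taken from the text above), the example below counts how many times the basic operation executes for a linear search, whose count grows as O(n), and a binary search, whose count grows as O(log n), over the same sorted list.

```python
# Illustrative sketch: count executions of the basic operation (a comparison)
# to compare the order of growth of linear search and binary search.

def linear_search(items, target):
    comparisons = 0
    for value in items:
        comparisons += 1          # basic operation: one comparison per element
        if value == target:
            break
    return comparisons

def binary_search(items, target):
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1          # basic operation: one comparison per halving
        if items[mid] == target:
            break
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return comparisons

if __name__ == "__main__":
    for n in (1_000, 1_000_000):
        data = list(range(n))
        # Worst case for linear search: the target is the last element.
        print(n, linear_search(data, n - 1), binary_search(data, n - 1))
```

Running the sketch for n = 1,000 and n = 1,000,000 shows the linear count growing a thousandfold (from about 1,000 to about 1,000,000 comparisons) while the logarithmic count only roughly doubles (from about 10 to about 20), which is exactly the order-of-growth comparison described above.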