PyTorch's autograd package automatically computes gradients for tensor computations. It builds a computational graph that records operations on tensors and uses reverse-mode differentiation, backpropagating from the root of the graph (the output) to its leaves (the inputs). Tensors with requires_grad=True store both their data and their accumulated gradients, and each non-leaf tensor references the function that created it via its grad_fn attribute (in older versions of PyTorch this role was played by the now-deprecated Variable class, which has since been merged into Tensor). A typical example sets requires_grad, performs operations that are tracked to build the graph, computes gradients with backward(), and prints the resulting gradients.
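The workflow described above can be sketched as a minimal example (the specific function z = (w*x + 1)^2 is chosen here purely for illustration):

```python
import torch

# Leaf tensors with gradient tracking enabled
x = torch.tensor(2.0, requires_grad=True)
w = torch.tensor(3.0, requires_grad=True)

# Operations on tracked tensors are recorded in the computational graph
y = w * x + 1.0   # y = w*x + 1 = 7
z = y ** 2        # z = (w*x + 1)^2 = 49

# Each non-leaf tensor references the function that created it
print(z.grad_fn)  # a PowBackward node

# Reverse-mode differentiation: backpropagate from the root z to the leaves
z.backward()

# dz/dx = 2*(w*x + 1)*w = 2*7*3 = 42
# dz/dw = 2*(w*x + 1)*x = 2*7*2 = 28
print(x.grad)     # tensor(42.)
print(w.grad)     # tensor(28.)
```

Calling backward() on the scalar root walks the recorded graph in reverse and accumulates each leaf's gradient into its .grad field.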