This document summarizes the evolution of convolutional neural networks (CNNs) from LeNet to ResNet. It discusses key CNN architectures, including AlexNet, VGGNet, GoogLeNet, and ResNet, and the techniques they introduced or popularized, such as ReLU activations, dropout, batch normalization, and residual connections. These techniques reduced overfitting and enabled the training of much deeper networks, driving steady gains on the ImageNet challenge: top-5 error fell from AlexNet's 15.3% in 2012 to ResNet's 3.57% in 2015.
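To make the final technique concrete, the sketch below shows a minimal residual block in PyTorch: two 3x3 convolutions with batch normalization, plus an identity skip connection that adds the block's input to its output. This is an illustrative sketch, not code from the papers summarized here; the channel count and class name are assumptions chosen for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    """Illustrative basic residual block: two 3x3 conv -> batch norm stages
    with an identity skip connection, in the style of ResNet. The channel
    count is an assumption for this sketch."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = F.relu(self.bn1(self.conv1(x)))  # conv -> batch norm -> ReLU
        out = self.bn2(self.conv2(out))        # second conv -> batch norm
        out = out + x                          # residual (skip) connection
        return F.relu(out)

# Quick shape check: a 64-channel feature map keeps its shape through the block.
block = ResidualBlock(64)
y = block(torch.randn(1, 64, 32, 32))
print(y.shape)  # torch.Size([1, 64, 32, 32])
```

Because the skip connection lets each block learn a residual relative to the identity mapping, gradients can flow directly through the addition, which is what makes networks of this depth trainable in practice.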