Support Vector Machines (SVMs) were first introduced in 1992 and gained popularity when they achieved a 1.1% error rate on handwritten digit recognition, matching carefully designed neural networks. They are now regarded as a canonical example of "kernel methods" in machine learning. An SVM finds the optimal decision boundary, known as the maximum-margin classifier: the boundary that separates the classes while maximizing the margin, i.e., the distance between the boundary and the closest data points of each class. These closest points are called support vectors.
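As an illustration (not from the source), here is a minimal sketch of a maximum-margin classifier using scikit-learn's `SVC`. The toy data points and the large `C` value (which approximates a hard margin) are assumptions for demonstration; after fitting, the `support_vectors_` attribute exposes the training points closest to the decision boundary.

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2-D data: two linearly separable classes (illustrative values)
X = np.array([[1.0, 1.0], [2.0, 1.5], [1.5, 2.0],
              [4.0, 4.0], [5.0, 4.5], [4.5, 5.0]])
y = np.array([0, 0, 0, 1, 1, 1])

# A linear kernel with a large C approximates the hard-margin classifier
clf = SVC(kernel="linear", C=1e6)
clf.fit(X, y)

# The support vectors are the training points nearest the separating boundary
print(clf.support_vectors_)

# Predict labels for two new points, one near each class
print(clf.predict([[1.0, 1.5], [5.0, 5.0]]))
```

Only the support vectors determine the fitted boundary; moving any other training point (without crossing the margin) leaves the classifier unchanged.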