Naïve Bayes is a probabilistic classifier that applies Bayes' theorem under the strong (naive) assumption that attribute values are conditionally independent given the class. It estimates the posterior probability of a class for an instance as proportional to the class's prior probability multiplied by the conditional probability of each attribute value given that class. Training is simple and fast because these per-class attribute probabilities are estimated separately from the training data. To classify a new instance, the stored probabilities are looked up and the class with the highest posterior probability is chosen. Despite its independence assumption often being violated, Naïve Bayes works surprisingly well in practice.
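As an illustration of this train-then-look-up procedure, here is a minimal sketch in Python for categorical attributes. The Laplace (add-one) smoothing, the use of log probabilities, and the toy data are illustrative assumptions, not requirements of the method described above.

```python
from collections import Counter, defaultdict
import math

class NaiveBayes:
    def fit(self, X, y):
        n = len(y)
        self.classes = sorted(set(y))
        # Prior P(c): relative frequency of each class in the training data.
        self.class_counts = Counter(y)
        self.log_prior = {c: math.log(self.class_counts[c] / n) for c in self.classes}
        # Conditional P(x_i = v | c): counted separately per attribute and class.
        self.value_counts = defaultdict(Counter)   # (class, attr index) -> value counts
        self.attr_values = defaultdict(set)        # attr index -> observed values
        for xs, c in zip(X, y):
            for i, v in enumerate(xs):
                self.value_counts[(c, i)][v] += 1
                self.attr_values[i].add(v)
        return self

    def _log_likelihood(self, c, i, v):
        # Laplace smoothing avoids zero probabilities for unseen attribute values.
        num = self.value_counts[(c, i)][v] + 1
        den = self.class_counts[c] + len(self.attr_values[i])
        return math.log(num / den)

    def predict(self, xs):
        # Posterior is proportional to prior times the product of conditionals;
        # summing logs replaces the product for numerical stability.
        def score(c):
            return self.log_prior[c] + sum(
                self._log_likelihood(c, i, v) for i, v in enumerate(xs))
        return max(self.classes, key=score)

# Hypothetical toy data: each instance is (outlook, temperature).
X = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
y = ["no", "no", "yes", "yes"]
print(NaiveBayes().fit(X, y).predict(("rain", "hot")))   # -> "yes"
```

Because each class-conditional count is accumulated independently, fitting is a single pass over the training data, and prediction is just a table lookup per attribute followed by an argmax over classes.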