1. K-nearest neighbors (k-NN) is a simple machine learning algorithm that stores all of the training data and classifies a new example by the majority class of its k nearest neighbors.
2. It is a lazy, non-parametric algorithm: it makes no assumptions about the distribution of the data. "Learning" amounts to storing the training examples, and classification assigns a class based on similarity to those stored examples.
3. k-NN has applications in areas such as credit rating, political science, handwriting recognition, and image recognition. It works by finding the k closest training examples in feature space and assigning the new example the majority class among those neighbors.
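
The procedure described above — store the training examples, find the k nearest, take a majority vote — can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: it assumes numeric feature vectors, uses plain Euclidean distance, and breaks voting ties by whichever class appears first among the neighbors.

```python
from collections import Counter
import math

def knn_classify(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    Distances are Euclidean; a voting tie goes to the class seen first
    among the k neighbors (Counter's insertion order).
    """
    # "Training" is just keeping the data around; all work happens here,
    # at query time (hence "lazy").
    distances = sorted(
        (math.dist(point, query), label)
        for point, label in zip(train_points, train_labels)
    )
    k_labels = [label for _, label in distances[:k]]
    return Counter(k_labels).most_common(1)[0][0]

# Toy 2-D data: two well-separated clusters.
points = [(1, 1), (1, 2), (2, 1), (8, 8), (8, 9), (9, 8)]
labels = ["A", "A", "A", "B", "B", "B"]

print(knn_classify(points, labels, (1.5, 1.5), k=3))  # → A
print(knn_classify(points, labels, (8.5, 8.5), k=3))  # → B
```

In practice the choice of k and of the distance metric matters: a small k is sensitive to noise, a large k blurs class boundaries, and features on different scales should be normalized before computing distances.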