What is K-Nearest Neighbors?
K-Nearest Neighbors (KNN) is a supervised machine learning algorithm. Given a labeled dataset of training pairs (x, y), it aims to model the relationship between x and y.
1. A new test point is classified by a majority vote of its neighbors.
2. The test point is assigned to the class most common among its k nearest neighbors.
3. If k = 1, the new point is assigned to the class of its single nearest neighbor.
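The majority-vote rule above can be sketched in a few lines of Python (the `majority_vote` helper and its labels are illustrative, not part of the slides):

```python
from collections import Counter

def majority_vote(neighbor_labels):
    """Return the class label most common among the k nearest neighbors."""
    return Counter(neighbor_labels).most_common(1)[0][0]

# With k = 5 neighbors labeled as below, the vote goes to "blue"
print(majority_vote(["blue", "yellow", "blue", "blue", "yellow"]))  # blue
```

With k = 1 the list has a single label, so the rule reduces to copying the nearest neighbor's class.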
Objective of the KNN Algorithm
KNN learns a function h: x → y such that, for a new test point x, h(x) can reliably predict the corresponding output y.
The performance of a KNN classifier largely depends on the distance metric used to identify the k nearest neighbors of a test point:
1. Euclidean Distance
2. Manhattan Distance
3. Minkowski Distance
4. Hamming Distance
[Figure: points P1(x1, y1) and P2(x2, y2) plotted on the x-y plane]
Euclidean Distance (d) = √((x2 − x1)² + (y2 − y1)²)
The most commonly employed distance is the Euclidean Distance!
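A minimal sketch of the first three metrics from the list in plain Python (function names are illustrative; the Euclidean function follows the formula above, generalized to any number of coordinates):

```python
import math

def euclidean(p1, p2):
    # d = sqrt((x2 - x1)^2 + (y2 - y1)^2), extended to n dimensions
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def manhattan(p1, p2):
    # Sum of absolute coordinate differences
    return sum(abs(a - b) for a, b in zip(p1, p2))

def minkowski(p1, p2, p=3):
    # Generalization: p = 1 gives Manhattan, p = 2 gives Euclidean
    return sum(abs(a - b) ** p for a, b in zip(p1, p2)) ** (1 / p)

print(euclidean((1, 2), (4, 6)))  # 5.0
```

Hamming distance, the fourth metric, counts positions where two equal-length sequences differ and is used for categorical features.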
How the KNN Algorithm Works
1. Choose the number of nearest neighbors (k).
2. Select a distance metric (d).
3. Calculate d between the test point x and every training point.
4. Count the data points in each class among the k nearest neighbors.
5. Assign x to the class with the most neighbors.
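The five steps above can be sketched as a from-scratch classifier (a minimal sketch assuming Euclidean distance; `knn_predict` and the sample data are illustrative):

```python
import math
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by a majority vote of its k nearest training points."""
    # Step 3: Euclidean distance from x to every training point
    dists = [math.dist(x, p) for p in X_train]
    # Step 4: labels of the k nearest neighbors
    nearest = sorted(range(len(X_train)), key=lambda i: dists[i])[:k]
    votes = [y_train[i] for i in nearest]
    # Step 5: assign x to the majority class
    return Counter(votes).most_common(1)[0][0]

# Illustrative 2-class toy data
X = [(1, 1), (1, 2), (5, 5), (6, 5)]
y = ["A", "A", "B", "B"]
print(knn_predict(X, y, (2, 2), k=3))  # A
```

Steps 1 and 2 correspond to the caller's choices of `k` and the distance function.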
KNN Classifier
[Figure: a test sample (green polygon) surrounded by blue circles and yellow squares]
If k = 3, the test sample (green polygon) is assigned to the class of the blue circles.
If k = 5, the test sample is assigned to the class of the yellow squares.
We will define a KNN classifier to classify the three types of flowers in the Iris dataset!
Let's demonstrate k-nearest neighbors in Google Colab!
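A minimal sketch of such a Colab demo using scikit-learn's `KNeighborsClassifier` on the Iris dataset (the split ratio and `random_state` are illustrative choices):

```python
# scikit-learn is preinstalled in Google Colab
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load the three-class Iris dataset (150 samples, 4 features)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# k = 3 neighbors; Euclidean (Minkowski p=2) distance is the default
clf = KNeighborsClassifier(n_neighbors=3)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy on the held-out test set
```

Trying several values of k (e.g. 1, 3, 5, 7) and comparing test accuracy is a natural next step in the notebook.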
