Understanding KNN and Logistic
Regression
With Examples and Iris Dataset Code
Explanation
By: [Your Name]
What is KNN?
• K-Nearest Neighbors (KNN) is a simple, non-parametric algorithm.
• Instance-based learning: no explicit training phase.
• Classification based on proximity in feature space.
How KNN Works
• 1. Choose a value for K (number of neighbors).
• 2. Calculate distance (e.g., Euclidean) from new point to training points.
• 3. Find K nearest neighbors.
• 4. Assign the most common class among the neighbors.
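The four steps above can be sketched directly. This is a minimal from-scratch version on made-up 2-D points (not data from the deck), using NumPy for the distance computation:

```python
import numpy as np

def knn_predict(X_train, y_train, x_new, k=3):
    # Step 2: Euclidean distance from the new point to every training point.
    distances = np.linalg.norm(X_train - x_new, axis=1)
    # Step 3: indices of the K nearest neighbors.
    nearest = np.argsort(distances)[:k]
    # Step 4: majority vote among the neighbors' labels.
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Hypothetical training data: two clusters, classes 0 and 1.
X_train = np.array([[1.0, 1.0], [1.2, 0.9], [5.0, 5.0], [5.1, 4.8]])
y_train = np.array([0, 0, 1, 1])

print(knn_predict(X_train, y_train, np.array([1.1, 1.0]), k=3))  # -> 0
```

Step 1 (choosing K) happens at the call site; K should usually be odd for binary problems to avoid ties.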
Example of KNN
• New fruit to classify: Shape = Round, Color = Red.
• Training examples: Apples (Round, Red), Bananas (Long, Yellow).
• Most neighbors suggest Apple -> Classified as Apple.
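The fruit example can be reproduced with scikit-learn's `KNeighborsClassifier` by encoding the categorical features numerically (the 0/1 encoding here is an assumption, not part of the original example):

```python
from sklearn.neighbors import KNeighborsClassifier

# Encode shape (0 = Round, 1 = Long) and color (0 = Red, 1 = Yellow).
X = [[0, 0], [0, 0], [1, 1], [1, 1]]          # two apples, two bananas
y = ["Apple", "Apple", "Banana", "Banana"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X, y)

# New fruit: Round, Red -> nearest neighbors are mostly apples.
print(model.predict([[0, 0]])[0])  # -> Apple
```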
What is Logistic Regression?
• A statistical model for binary or multi-class classification.
• Estimates the probability of a data point belonging to a class.
• Uses a logistic (sigmoid) function to map the output to a probability between 0 and 1.
How Logistic Regression Works
• 1. Compute weighted sum of inputs.
• 2. Apply sigmoid activation function.
• 3. Predict class based on a threshold (e.g., 0.5).
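The three steps above can be written out in a few lines. The weights and bias below are made-up illustration values, not fitted parameters:

```python
import math

def predict(x, weights, bias, threshold=0.5):
    # Step 1: weighted sum of inputs.
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    # Step 2: sigmoid squeezes z into (0, 1).
    p = 1.0 / (1.0 + math.exp(-z))
    # Step 3: compare the probability against the threshold.
    return int(p >= threshold), p

label, prob = predict([2.0, 1.0], weights=[1.5, -0.5], bias=-1.0)
print(label, round(prob, 3))  # -> 1 0.818
```

Here z = 1.5·2.0 − 0.5·1.0 − 1.0 = 1.5, and sigmoid(1.5) ≈ 0.82, which is above the 0.5 threshold.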
Example of Logistic Regression
• Predict whether a student passes based on study hours.
• More study hours -> Higher probability of passing.
• Probability output: e.g., 0.8 -> Pass.
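The study-hours example can be sketched with scikit-learn. The data here is invented purely for illustration (students studying 5+ hours pass):

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical data: hours studied -> pass (1) or fail (0).
hours = [[1], [2], [3], [4], [5], [6], [7], [8]]
passed = [0, 0, 0, 0, 1, 1, 1, 1]

model = LogisticRegression()
model.fit(hours, passed)

# Probability of passing after 7 hours of study.
p = model.predict_proba([[7]])[0][1]
print(round(p, 2))
```

`predict_proba` returns the class probabilities directly, so you can see the "more hours → higher probability" trend rather than just a hard label.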
Logistic Regression - Iris Dataset
Code (Part 1)
• Load Iris dataset using sklearn.
• Split data into features (X) and labels (y).
• Prepare training and testing sets.
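The Part 1 steps above correspond to code along these lines (the 80/20 split and `random_state=42` are assumptions for reproducibility, not specified in the deck):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# Load the Iris dataset bundled with scikit-learn.
iris = load_iris()
X, y = iris.data, iris.target              # features (X) and labels (y)

# Split into training and testing sets (80/20).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

print(X_train.shape, X_test.shape)  # -> (120, 4) (30, 4)
```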
Logistic Regression - Iris Dataset
Code (Part 2)
• Initialize LogisticRegression model.
• Train model with training data.
• Predict outcomes on test data.
• Evaluate with accuracy score.
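Continuing from the Part 1 split, the Part 2 steps look like this (again a sketch; `max_iter=200` is an assumption to ensure the solver converges on Iris):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Initialize and train the model.
model = LogisticRegression(max_iter=200)
model.fit(X_train, y_train)

# Predict on the test set and evaluate.
y_pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
```

On this small, well-separated dataset the accuracy is typically very high.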
Code Walkthrough
• 1. Load data.
• 2. Preprocess: train-test split.
• 3. Train model.
• 4. Test model.
• 5. Measure and print accuracy.
Summary
• KNN: Distance-based, simple, effective.
• Logistic Regression: Probability-based, robust for classification.
• Both are key tools in machine learning!
Thank You!
• Any Questions?
• Happy Learning!