This document introduces logistic regression and the perceptron algorithm. It defines both techniques, walks through an example of classifying email as spam or ham, and describes the algorithms step by step. It shows how the perceptron algorithm iteratively adjusts the separating line by moving it towards misclassified points, and introduces gradient descent as a way to minimize the log-loss error function in logistic regression.
Introduction to logistic regression and perceptron algorithm by Luis Serrano.
Overview of related videos on linear regression, logistic regression, support vector machines, and the perceptron algorithm.
Definition of logistic regression and the perceptron's application to classifying emails as spam or non-spam, with rules for flagging spam based on appearances of the word 'buy' and spelling mistakes in emails.
Goal of classification using logistic regression to differentiate between ham and spam. Detailed steps of the perceptron algorithm, including how to classify points and adjust the separating line.
Techniques to pivot, rotate, and translate the line used in perceptron classification.
Further detail on the perceptron trick for adjusting classification lines.
Correct vs incorrect classifications and the importance of evaluating decision boundaries in machine learning models.
Introduction to gradient descent and its parallels with perceptron error minimization.
Introduction to the log-loss function as a continuous error measure in logistic regression, and how the errors are calculated.
The complete step-by-step logistic regression algorithm, adjusting the line over a number of epochs with a learning rate.
Wrap-up of the presentation, with the relevant algorithms, thanks, and channels to follow for more insights.
Perceptron algorithm

Step 1: Start with a random line
with blue and red sides.

Step 2: Pick a large number, 1000.
(number of repetitions, or epochs)

Step 3: Repeat 1000 times:
- Pick a random point
- If the point is correctly classified:
  - Do nothing
- If the point is incorrectly classified:
  - Move the line towards the point

Step 4: Enjoy your line that
separates the data!
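The classification check in Step 3 just asks which side of the line a point falls on. A minimal sketch in Python; the sign convention (red on the positive side of the line) is an assumption for illustration:

```python
def side(a, b, c, x, y):
    """Return which side of the line ax + by + c = 0 the point (x, y) is on.
    Assumed convention: red points live where ax + by + c > 0, blue otherwise."""
    return "red" if a * x + b * y + c > 0 else "blue"

# For the line x + y - 2 = 0 (a=1, b=1, c=-2):
print(side(1, 1, -2, 3, 3))  # (3, 3): 3 + 3 - 2 = 4 > 0, so "red"
print(side(1, 1, -2, 0, 0))  # (0, 0): 0 + 0 - 2 < 0, so "blue"
```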
Algorithm

Step 1: Start with a random line
of equation ax + by + c = 0.

Step 2: Pick a large number, 1000.
(number of repetitions, or epochs)

Step 3: Pick a small number, 0.01.
(learning rate)

Step 4: Repeat 1000 times:
- Pick a random point
- If the point is correctly classified:
  - Do nothing
- If the point is incorrectly classified, move the line towards the point:
  - Add 0.01 to a
  - Add 0.01 to b
  - Add 0.01 to c

Step 5: Enjoy your fitted line!
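These steps can be sketched in Python, under assumptions not on the slide: red points belong on the positive side of the line, and the "add" becomes a "subtract" for misclassified blue points (as the perceptron trick later in the document makes explicit). The dataset is made up for illustration.

```python
import random

def fit_line_naive(points, epochs=1000, lr=0.01):
    """First version of the algorithm: on a misclassified point,
    nudge all three coefficients by the learning rate.
    points: list of (x, y, color) with color "red" or "blue"."""
    random.seed(0)  # fixed seed so the sketch is reproducible
    a, b, c = random.random(), random.random(), random.random()  # Step 1: random line
    for _ in range(epochs):                                      # Step 4: repeat
        x, y, color = random.choice(points)                      # pick a random point
        predicted = "red" if a * x + b * y + c > 0 else "blue"
        if predicted == color:
            continue                           # correctly classified: do nothing
        nudge = lr if color == "red" else -lr  # move the line towards the point
        a, b, c = a + nudge, b + nudge, c + nudge
    return a, b, c

a, b, c = fit_line_naive([(3, 3, "red"), (0, 0, "blue")])
```

Nudging all three coefficients by the same amount ignores where the point is; the perceptron trick below scales the updates by the point's coordinates instead.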
Perceptron trick
Perceptron algorithm

Step 1: Start with a random line
of equation ax + by + c = 0.

Step 2: Pick a large number, 1000.
(number of repetitions, or epochs)

Step 3: Pick a small number, 0.01.
(learning rate)

Step 4: Repeat 1000 times:
- Pick a random point (p, q)
- If the point is correctly classified:
  - Do nothing
- If the point is blue, in the red area:
  - Subtract 0.01p from a
  - Subtract 0.01q from b
  - Subtract 0.01 from c
- If the point is red, in the blue area:
  - Add 0.01p to a
  - Add 0.01q to b
  - Add 0.01 to c

Step 5: Enjoy your line!
Equivalently, in terms of the line equation:
- If the point is correctly classified:
  - Do nothing
- If the point is blue and ap + bq + c > 0:
  - Subtract 0.01p from a
  - Subtract 0.01q from b
  - Subtract 0.01 from c
- If the point is red and ap + bq + c < 0:
  - Add 0.01p to a
  - Add 0.01q to b
  - Add 0.01 to c
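The trick translates almost line for line into Python; the example data and the convention that red means ap + bq + c > 0 are assumptions:

```python
import random

def perceptron_trick(points, epochs=1000, lr=0.01):
    """Perceptron algorithm with the coordinate-scaled updates above.
    points: list of (p, q, color); red should end up where ap + bq + c > 0."""
    random.seed(0)  # fixed seed so the sketch is reproducible
    a, b, c = random.random(), random.random(), random.random()  # Step 1
    for _ in range(epochs):                                      # Step 4
        p, q, color = random.choice(points)
        value = a * p + b * q + c
        if color == "blue" and value > 0:    # blue point in the red area
            a, b, c = a - lr * p, b - lr * q, c - lr
        elif color == "red" and value < 0:   # red point in the blue area
            a, b, c = a + lr * p, b + lr * q, c + lr
        # otherwise correctly classified: do nothing
    return a, b, c

a, b, c = perceptron_trick([(3, 3, "red"), (1, 1, "blue")])
```

Scaling by p and q means points far from the origin move the line more, which is what "move the line towards the point" amounts to algebraically.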
Log-loss error

A bad line gives a large log-loss error; a good line gives a small log-loss error.
Minimize it using calculus (gradient descent).
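A single point's log-loss can be written directly from its label y (1 for one class, 0 for the other, an assumed encoding) and its predicted probability ŷ; the example numbers are illustrative:

```python
import math

def log_loss(y, y_hat):
    """Log-loss of one prediction: near 0 when the prediction is confident
    and correct, large when it is confident and wrong."""
    return -y * math.log(y_hat) - (1 - y) * math.log(1 - y_hat)

# A good line assigns ~0.95 to a label-1 point: small error.
small = log_loss(1, 0.95)
# A bad line assigns ~0.05 to the same point: large error.
large = log_loss(1, 0.05)
```

Unlike a simple count of misclassified points, this error changes continuously as the line moves, which is what makes gradient descent applicable.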
Logistic regression algorithm
Step 1: Start with a random line
of equation ax + by + c = 0.

Step 2: Pick a large number, 1000.
(number of repetitions, or epochs)

Step 3: Pick a small number, 0.01.
(learning rate)

Step 4: Repeat 1000 times:
- Pick a random point (p, q), with label y and prediction ŷ
- If the point is correctly classified:
  - Move the line away from the point
- If the point is incorrectly classified:
  - Move the line towards the point
- In both cases, the update is:
  - Add 0.01(y - ŷ)p to a
  - Add 0.01(y - ŷ)q to b
  - Add 0.01(y - ŷ) to c

Step 5: Enjoy your fitted line!
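The steps above can be sketched as follows, assuming labels y = 1 for red and y = 0 for blue, with ŷ = sigmoid(ap + bq + c); the dataset is made up for illustration:

```python
import math
import random

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def logistic_regression(points, epochs=1000, lr=0.01):
    """Gradient descent on log-loss: every point nudges the line by
    0.01 * (y - y_hat), scaled by its coordinates.
    points: list of (p, q, y) with y = 1 (red) or y = 0 (blue)."""
    random.seed(0)  # fixed seed so the sketch is reproducible
    a, b, c = random.random(), random.random(), random.random()  # Step 1
    for _ in range(epochs):                                      # Step 4
        p, q, y = random.choice(points)
        y_hat = sigmoid(a * p + b * q + c)   # prediction for this point
        a += lr * (y - y_hat) * p            # the three updates above
        b += lr * (y - y_hat) * q
        c += lr * (y - y_hat)
    return a, b, c

a, b, c = logistic_regression([(3, 3, 1), (-3, -3, 0)])
```

Note that, unlike the perceptron, correctly classified points also move the line (slightly away from themselves), since y - ŷ is small but never exactly zero.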