Classification models help predict discrete (categorical) variables.
Examples: disease prediction, fraud detection, employee churn
Linear regression is not a good fit for these problems,
since its output is unbounded rather than a class probability.
We need something different.
The sigmoid function maps any real input to a value between 0 and 1.
It is especially useful for models where
we have to predict a probability as
the output.
Solution for y: sigmoid(y) = 1 / (1 + e^(-y))
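The sigmoid above can be sketched in a few lines of Python (function name `sigmoid` is my own choice for illustration):

```python
import math

def sigmoid(y):
    # Squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-y))

print(sigmoid(0))    # 0.5, the midpoint of the S curve
print(sigmoid(10))   # close to 1
print(sigmoid(-10))  # close to 0
```

Note that sigmoid(0) is exactly 0.5, which is why 0.5 is the natural default threshold.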
[Figure: output of sigmoid, the S-shaped sigmoid curve, ranging from 0 to 1]
The sigmoid output is then converted to a class label
based on a threshold value.
A confusion matrix is an array that
indicates how many predictions in each
class were correct or incorrect. The terminology:
Yes (1) – Positive
No (0) – Negative
Correct prediction – True
Wrong prediction – False
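The four cells of a binary confusion matrix can be counted directly; a sketch in plain Python (the function name `confusion_counts` and the sample data are illustrative):

```python
def confusion_counts(y_true, y_pred):
    # Count the four confusion-matrix cells for binary labels (1 = Yes, 0 = No)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # True Positive
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # False Positive
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # True Negative
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # False Negative
    return tp, fp, tn, fn

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]
print(confusion_counts(y_true, y_pred))  # (3, 1, 3, 1)
```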
Accuracy (ratio of correct predictions):
Accuracy = (TP + TN) / (TP + FP + TN + FN)
Precision (correct Yes predictions out of predicted Yes):
Precision = TP / (TP + FP)
Recall, or Sensitivity (correct Yes predictions out of total actual Yes):
Recall = TP / (TP + FN)
Specificity (correct No predictions out of total actual No):
Specificity = TN / (TN + FP)
F1 Score (harmonic mean of Precision and Recall):
F1 Score = (2 * Precision * Recall) / (Precision + Recall)
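The formulas above can be checked by computing them from the four counts; a minimal sketch (the counts TP=3, FP=1, TN=3, FN=1 are illustrative sample values):

```python
def metrics(tp, fp, tn, fn):
    # Compute the five classification metrics from confusion-matrix counts
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)           # also called sensitivity
    specificity = tn / (tn + fp)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, specificity, f1

acc, prec, rec, spec, f1 = metrics(tp=3, fp=1, tn=3, fn=1)
print(acc, prec, rec, spec, f1)  # 0.75 0.75 0.75 0.75 0.75
```

With these particular counts the classes are balanced and the errors symmetric, so all five metrics coincide; on skewed data they diverge, which is why accuracy alone can mislead.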
Logistic regression classification
