AI Club
Classification
Accuracy and Confusion Matrix
How do you measure how good a Model is?
• Create a “Validation Dataset” – a part of the training data put aside to test (see the short sketch below)
• The validation dataset has the actual (correct) answers
• After the model is trained, use it to predict the answer and compare the prediction to the actual answer in the validation dataset
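If you want to see the idea in code, here is a minimal Python sketch, assuming the data is stored in plain Python lists (the variable names and split point are illustrative, not a specific AIClub workflow). It uses the four rows from the example table later in this deck and sets the last row aside as the validation dataset.

```python
# A minimal sketch (illustrative only): set part of the data aside
# as a validation dataset before training.

# Each row: [countries_visited, years_in_school, height_ft]
features = [[15, 13, 5.0], [2, 3, 3.5], [7, 4, 4.0], [1, 5, 7.0]]
labels = ["Adult", "Child", "Child", "Adult"]   # the actual (correct) answers

# Put the last row aside to test; train only on the rest
train_features, val_features = features[:-1], features[-1:]
train_labels,   val_labels   = labels[:-1],   labels[-1:]

print(len(train_features), "rows for training,", len(val_features), "row for validation")
```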
How to Measure Classification
• In classification, an answer is precisely right or wrong: right if it matches the label, wrong otherwise (no matter how many categories there are)
• Accuracy = Correct Answers × 100 / Total Answers (see the sketch after the example table below)
• The closer to 100%, the better
• We will see more metrics in later projects
Number of Countries Visited | Number of Years in School | Height (Feet) | Original Label (Who am I?) | What the Model Predicted
15 | 13 | 5.0 | Adult | Adult
2  |  3 | 3.5 | Child | Child
7  |  4 | 4.0 | Child | Child
1  |  5 | 7.0 | Adult | Child

3 Correct × 100 / 4 Total = 75%
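Here is a minimal Python sketch of the same accuracy calculation, using the four labels and predictions from the table above (illustrative only):

```python
# Accuracy = correct answers * 100 / total answers
actual    = ["Adult", "Child", "Child", "Adult"]   # labels in the validation dataset
predicted = ["Adult", "Child", "Child", "Child"]   # what the model predicted

correct = sum(1 for a, p in zip(actual, predicted) if a == p)
accuracy = correct * 100 / len(actual)
print(accuracy)   # 3 correct * 100 / 4 total = 75.0
```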
What is the Accuracy?

True Label | Predicted Value
Adult | Adult
Adult | Child
Child | Adult
Adult | Child
Child | Child
Child | Child
Child | Adult
Adult | Adult
Child | Child
Adult | Adult

Accuracy: 60%
What is the Accuracy?

True Label | Predicted Value
Child | Child
Child | Child
Child | Child
Adult | Child
Child | Child
Child | Child
Child | Child
Adult | Child
Child | Child
Child | Child

Accuracy: 80%
What did you notice?
• In table 2, a dumb algorithm that predicts only one value has a high accuracy.
• The accuracy value is not enough to measure the performance of an algorithm.
• We need other ways to understand if an algorithm is good or bad.
Create a confusion matrix
Fill the grid below with the counts, using the true labels and predictions from the first exercise:

True Label | Predicted Value
Adult | Adult
Adult | Child
Child | Adult
Adult | Child
Child | Child
Child | Child
Child | Adult
Adult | Adult
Child | Child
Adult | Adult

• How many times did the algorithm predict Adult and was right about it? 3
• How many times did the algorithm predict Child and was right about it? 3
• How many times did the algorithm predict Child and was wrong about it? 2
• How many times did the algorithm predict Adult and was wrong about it? 2

That’s your confusion matrix. It tells you how confused your algorithm is!

                PREDICTIONS
                Child   Adult
LABELS   Child    3       2
         Adult    2       3
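If you are curious how the counting could look in code, here is a minimal Python sketch over the same ten (true label, predicted value) pairs; using collections.Counter is just one convenient way to tally them (illustrative only):

```python
# Count how often each (true label, predicted value) pair occurs.
from collections import Counter

pairs = [("Adult", "Adult"), ("Adult", "Child"), ("Child", "Adult"),
         ("Adult", "Child"), ("Child", "Child"), ("Child", "Child"),
         ("Child", "Adult"), ("Adult", "Adult"), ("Child", "Child"),
         ("Adult", "Adult")]

counts = Counter(pairs)
for label in ["Child", "Adult"]:
    for prediction in ["Child", "Adult"]:
        # Each printed line is one cell of the confusion matrix
        print(f"label={label}, predicted={prediction}: {counts[(label, prediction)]}")
```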
• How can you calculate Accuracy from your confusion matrix?
• Accuracy = sum(Diagonal Values) × 100 / Total = (3 + 3) × 100 / (3 + 3 + 2 + 2) = 60%
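A minimal Python sketch of reading accuracy off the matrix diagonal; storing the matrix as a nested list is just an assumption for illustration:

```python
# Rows are true labels (Child, Adult); columns are predictions (Child, Adult).
matrix = [[3, 2],   # true Child: predicted Child, predicted Adult
          [2, 3]]   # true Adult: predicted Child, predicted Adult

diagonal = matrix[0][0] + matrix[1][1]          # correct predictions
total = sum(sum(row) for row in matrix)         # all predictions
print(diagonal * 100 / total)                   # (3 + 3) * 100 / 10 = 60.0
```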
Exercise - 5
Calculate accuracy using the confusion matrix.

                PREDICTIONS
                Child   Adult
LABELS   Child    3       2
         Adult    2       3

                PREDICTIONS
                Child   Adult
LABELS   Child    8       0
         Adult    2       0
THANK YOU
https://aiclub.world
info@pyxeda.ai
