CS5720 - Week 3

Confusion Matrix and Classification Reports

Understanding the Confusion Matrix

A confusion matrix visualizes how your model's predictions compare to actual labels.

                     Predicted Negative    Predicted Positive
  Actual Negative        TN = 850              FP = 50
  Actual Positive        FN = 30               TP = 70

TN = true negative, FP = false positive, FN = false negative, TP = true positive.
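A minimal sketch of how this matrix can be computed with scikit-learn; y_true and y_pred are illustrative arrays constructed here to reproduce the counts above, not real model output:

from sklearn.metrics import confusion_matrix
import numpy as np

# Hypothetical labels matching the example: 900 negatives, 100 positives
y_true = np.array([0] * 900 + [1] * 100)
# Predictions giving 850 TN, 50 FP, 30 FN, 70 TP
y_pred = np.array([0] * 850 + [1] * 50 + [0] * 30 + [1] * 70)

cm = confusion_matrix(y_true, y_pred)   # rows = actual, columns = predicted
tn, fp, fn, tp = cm.ravel()
print(cm)                               # [[850  50] [ 30  70]]
print(tn, fp, fn, tp)                   # 850 50 30 70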

Classification Report

A comprehensive summary of your model's performance metrics.
              precision    recall  f1-score   support

     Class 0       0.96      0.94      0.95       900
     Class 1       0.58      0.70      0.64       100

    accuracy                           0.92      1000
   macro avg       0.77      0.82      0.79      1000
weighted avg       0.93      0.92      0.92      1000
Key Insights:
• Large support imbalance (900 vs. 100 examples)
• The model performs much better on the majority class (Class 0)
• Precision is low for the minority class (Class 1): many of its positive predictions are false positives
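A minimal sketch that produces a report like the one above with scikit-learn, reusing the same illustrative labels as in the previous sketch (the exact rounded values may differ slightly from the slide):

from sklearn.metrics import classification_report
import numpy as np

# Hypothetical labels matching the example matrix
y_true = np.array([0] * 900 + [1] * 100)
y_pred = np.array([0] * 850 + [1] * 50 + [0] * 30 + [1] * 70)

print(classification_report(y_true, y_pred, target_names=["Class 0", "Class 1"]))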

Build Your Own Confusion Matrix

[Interactive exercise: toggle your prediction for each labeled example; the confusion matrix, accuracy, precision, and recall update as you go.]
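A minimal sketch of the bookkeeping behind this exercise, computing accuracy, precision, and recall directly from the four cell counts (metrics_from_counts is an illustrative helper, not part of any library):

def metrics_from_counts(tp, fp, fn, tn):
    # Accuracy: fraction of all examples classified correctly
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    # Precision: of the examples predicted positive, how many really are
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    # Recall: of the actually positive examples, how many were found
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return accuracy, precision, recall

# Using the counts from the example matrix above
acc, prec, rec = metrics_from_counts(tp=70, fp=50, fn=30, tn=850)
print(f"Accuracy: {acc:.2f}, Precision: {prec:.2f}, Recall: {rec:.2f}")
# Accuracy: 0.92, Precision: 0.58, Recall: 0.70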
Prepared by Dr. Gorkem Kar