Understanding the Confusion Matrix
A confusion matrix visualizes how your model's predictions compare to actual labels.
                      Predicted Negative    Predicted Positive
    Actual Negative        TN = 850              FP = 50
    Actual Positive        FN = 30               TP = 70
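The four counts above are enough to compute the headline metrics by hand. A minimal sketch in plain Python, using the counts from the example matrix (TN = 850, FP = 50, FN = 30, TP = 70):

```python
# Counts from the example confusion matrix above.
tn, fp, fn, tp = 850, 50, 30, 70

accuracy = (tp + tn) / (tp + tn + fp + fn)  # all correct / all predictions
precision = tp / (tp + fp)                  # of predicted positives, fraction correct
recall = tp / (tp + fn)                     # of actual positives, fraction found
f1 = 2 * precision * recall / (precision + recall)

print(f"accuracy={accuracy:.2f}  precision={precision:.2f}  "
      f"recall={recall:.2f}  f1={f1:.2f}")
# → accuracy=0.92  precision=0.58  recall=0.70  f1=0.64
```

These are exactly the Class 1 numbers that appear in the classification report below.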
Classification Report
A comprehensive summary of your model's performance metrics.
                  precision    recall  f1-score   support

         Class 0       0.96      0.94      0.95       900
         Class 1       0.58      0.70      0.64       100

        accuracy                           0.92      1000
       macro avg       0.77      0.82      0.79      1000
    weighted avg       0.93      0.92      0.92      1000
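The two average rows differ only in how per-class scores are combined: macro avg is a plain mean over classes, while weighted avg scales each class by its support. A small sketch recomputing the precision and recall averages from the raw counts above:

```python
# Per-class precision/recall recomputed from the example counts
# (TN=850, FP=50, FN=30, TP=70; supports 900 and 100).
p0, r0, n0 = 850 / 880, 850 / 900, 900   # Class 0
p1, r1, n1 = 70 / 120, 70 / 100, 100     # Class 1

macro_p = (p0 + p1) / 2                        # unweighted mean over classes
macro_r = (r0 + r1) / 2
weighted_p = (n0 * p0 + n1 * p1) / (n0 + n1)   # support-weighted mean
weighted_r = (n0 * r0 + n1 * r1) / (n0 + n1)

print(f"macro    precision={macro_p:.2f}  recall={macro_r:.2f}")
print(f"weighted precision={weighted_p:.2f}  recall={weighted_r:.2f}")
# → macro    precision=0.77  recall=0.82
# → weighted precision=0.93  recall=0.92
```

Because Class 0 holds 90% of the support, the weighted averages sit close to the Class 0 scores, while the macro averages are pulled down by the weaker minority class.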
Key Insights:
• Strong class imbalance: support of 900 vs 100
• The model is far better on the majority class (f1-score 0.95 vs 0.64)
• Precision for the minority class is low (0.58): of 120 predicted positives, 50 are false alarms
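One way to see why the 0.92 accuracy is less impressive than it looks: a trivial baseline that always predicts the majority class already scores 0.90 on this split. A quick check using the support counts from the report:

```python
# Majority-class baseline on the example split (900 negatives, 100 positives).
n_negative, n_positive = 900, 100
baseline_accuracy = n_negative / (n_negative + n_positive)  # always predict "negative"
model_accuracy = 0.92                                       # from the report above

print(f"baseline={baseline_accuracy:.2f}  model={model_accuracy:.2f}")
# → baseline=0.90  model=0.92
```

This is why, on imbalanced data, per-class precision/recall and the macro average tell you more than accuracy alone.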