Confusion Matrix Metrics Calculator

Enter the four values of a binary confusion matrix to calculate classification performance metrics including Accuracy, Precision, Recall (Sensitivity), Specificity, F1 Score, MCC, and more.


Formulas

Given: TP (True Positives), FP (False Positives), FN (False Negatives), TN (True Negatives), N = TP + FP + FN + TN

  • Accuracy = (TP + TN) / N
  • Precision (PPV) = TP / (TP + FP)
  • Recall / Sensitivity (TPR) = TP / (TP + FN)
  • Specificity (TNR) = TN / (TN + FP)
  • F1 Score = 2 × (Precision × Recall) / (Precision + Recall)
  • Balanced Accuracy = (Sensitivity + Specificity) / 2
  • MCC = (TP×TN − FP×FN) / √[(TP+FP)(TP+FN)(TN+FP)(TN+FN)]
  • False Positive Rate (FPR) = FP / (FP + TN)
  • False Negative Rate (FNR) = FN / (FN + TP)
  • Negative Predictive Value (NPV) = TN / (TN + FN)
  • False Discovery Rate (FDR) = FP / (FP + TP)
  • Informedness (Youden's J) = Sensitivity + Specificity − 1
  • Markedness = Precision + NPV − 1
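The formulas above can be sketched in Python. This is a minimal illustration, not the calculator's actual implementation; the names `confusion_metrics` and `_div` are hypothetical, and zero denominators are returned as `None` (reported as N/A, matching the convention noted under Assumptions):

```python
import math

def _div(num, den):
    """Return num / den, or None (reported as N/A) when den == 0."""
    return num / den if den else None

def confusion_metrics(tp, fp, fn, tn):
    """Compute the listed metrics from the four confusion-matrix counts."""
    n = tp + fp + fn + tn
    precision = _div(tp, tp + fp)        # PPV
    recall = _div(tp, tp + fn)           # Sensitivity / TPR
    specificity = _div(tn, tn + fp)      # TNR
    npv = _div(tn, tn + fn)

    f1 = None
    if precision is not None and recall is not None and (precision + recall):
        f1 = 2 * precision * recall / (precision + recall)

    balanced_acc = informedness = None
    if recall is not None and specificity is not None:
        balanced_acc = (recall + specificity) / 2
        informedness = recall + specificity - 1   # Youden's J

    markedness = None
    if precision is not None and npv is not None:
        markedness = precision + npv - 1

    mcc_den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return {
        "Accuracy": _div(tp + tn, n),
        "Precision": precision,
        "Recall": recall,
        "Specificity": specificity,
        "F1": f1,
        "Balanced Accuracy": balanced_acc,
        "MCC": _div(tp * tn - fp * fn, mcc_den),
        "FPR": _div(fp, fp + tn),
        "FNR": _div(fn, fn + tp),
        "NPV": npv,
        "FDR": _div(fp, fp + tp),
        "Informedness": informedness,
        "Markedness": markedness,
    }
```

For example, `confusion_metrics(50, 10, 5, 35)` gives Accuracy 0.85, Precision 50/60 ≈ 0.833, and Recall 50/55 ≈ 0.909, while `confusion_metrics(0, 0, 5, 5)` reports Precision as N/A.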

Assumptions & References

  • Applies to binary classification problems only (positive vs. negative class).
  • All four inputs (TP, FP, FN, TN) must be non-negative integers.
  • Metrics that require division are reported as N/A when the denominator is zero (e.g., Precision is N/A when TP + FP = 0).
  • MCC ranges from −1 (total disagreement) to +1 (perfect prediction); 0 indicates prediction no better than chance. It is considered a balanced metric even on imbalanced datasets.
  • F1 Score is the harmonic mean of Precision and Recall; it is often preferred over accuracy on imbalanced datasets.
  • Youden's J (Informedness) estimates the probability that a prediction is informed rather than a chance guess.
  • References: Fawcett (2006) An introduction to ROC analysis, Pattern Recognition Letters; Powers (2011) Evaluation: From Precision, Recall and F-Measure to ROC, Informedness, Markedness & Correlation, JMLR.
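The stated MCC range can be checked directly from the formula above; as a self-contained sketch (the helper name `mcc` is hypothetical):

```python
import math

def mcc(tp, fp, fn, tn):
    """MCC = (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)); None if undefined."""
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / den if den else None

print(mcc(50, 0, 0, 50))   # perfect prediction  -> 1.0
print(mcc(0, 50, 50, 0))   # total disagreement  -> -1.0
```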
