Confusion matrix

                 Predicted Negative     Predicted Positive
Actual Negative  True Negative (TN)     False Positive (FP)
Actual Positive  False Negative (FN)    True Positive (TP)
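The four counts can be tallied directly from paired label lists. A minimal pure-Python sketch (the label convention 0 = negative, 1 = positive is an assumption, as is the function name):

```python
def confusion_counts(y_true, y_pred):
    """Count TN, FP, FN, TP for binary labels (0 = negative, 1 = positive)."""
    tn = fp = fn = tp = 0
    for t, p in zip(y_true, y_pred):
        if t == 0 and p == 0:
            tn += 1          # correctly predicted negative
        elif t == 0 and p == 1:
            fp += 1          # negative wrongly flagged as positive
        elif t == 1 and p == 0:
            fn += 1          # positive missed by the classifier
        else:
            tp += 1          # correctly predicted positive
    return tn, fp, fn, tp

y_true = [0, 0, 1, 1, 1, 0]
y_pred = [0, 1, 1, 0, 1, 0]
print(confusion_counts(y_true, y_pred))  # (2, 1, 1, 2)
```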

Accuracy

Accuracy = (TN + TP) / (TN + FP + TP + FN) = True Results / Total Results

Represents the proportion of correctly classified data instances out of the total number of data instances.

Not a good metric when the data set is unbalanced, i.e. when the negative and positive classes have very different numbers of instances.
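A short sketch of both the formula and the imbalance problem (the counts are made-up illustrative numbers):

```python
def accuracy(tn, fp, fn, tp):
    """Accuracy = correct predictions over all predictions."""
    return (tn + tp) / (tn + fp + fn + tp)

# Balanced data set: 50 negatives, 50 positives.
print(accuracy(tn=40, fp=10, fn=10, tp=40))  # 0.8

# Unbalanced data set: 95 negatives, 5 positives. A degenerate classifier
# that always predicts "negative" still scores 0.95 accuracy while
# catching zero positives -- accuracy hides the failure.
print(accuracy(tn=95, fp=0, fn=5, tp=0))     # 0.95
```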

Precision

Also known as the positive predictive value

Precision = TP / (TP + FP) = True Positives / Total Predicted Positives

Precision is ideally 1, which occurs when FP = 0 (no false positives), so that TP = TP + FP.
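A minimal sketch of the formula, with a guard for the degenerate case of no predicted positives (returning 0.0 there is a common convention, not the only one):

```python
def precision(tp, fp):
    """Fraction of predicted positives that are actually positive."""
    # Guard: if the classifier predicted no positives at all,
    # the ratio is undefined; return 0.0 by convention.
    return tp / (tp + fp) if (tp + fp) > 0 else 0.0

print(precision(tp=8, fp=2))  # 0.8
print(precision(tp=5, fp=0))  # 1.0  (no false positives -> ideal precision)
```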

Recall

Also known as sensitivity or the true positive rate

Recall = TP / (TP + FN) = True Positives / Total Actual Positives

Recall is ideally 1, which occurs when FN = 0 (no false negatives), so that TP = TP + FN.
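The same sketch for recall, again with a guard for the degenerate case of no actual positives:

```python
def recall(tp, fn):
    """Fraction of actual positives that the classifier found."""
    # Guard: if there are no actual positives, the ratio is undefined;
    # return 0.0 by convention.
    return tp / (tp + fn) if (tp + fn) > 0 else 0.0

print(recall(tp=8, fn=2))  # 0.8
print(recall(tp=5, fn=0))  # 1.0  (no false negatives -> ideal recall)
```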

In an ideal classifier, both precision and recall are 1. Hence we need a metric that takes both precision and recall into account.

F1 Score

F1 Score = 2 × (Precision × Recall) / (Precision + Recall)

F1 score only becomes high when both precision and recall are high.

It is the harmonic mean of precision and recall.
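A small sketch showing why the harmonic mean rewards balance: a very high precision cannot compensate for a very low recall (the input values are illustrative):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0  # both metrics zero -> F1 defined as 0 by convention
    return 2 * precision * recall / (precision + recall)

# Perfect precision but terrible recall is punished heavily.
print(f1_score(1.0, 0.1))  # ~0.18
# Balanced, moderate values score their common value.
print(f1_score(0.6, 0.6))  # 0.6
```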

Specificity

Measures the ability to correctly identify negative instances. Also known as the true negative rate

Specificity = TN / (TN + FP) = True Negatives / Total Actual Negatives
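A matching sketch for specificity, mirroring recall but over the negative class:

```python
def specificity(tn, fp):
    """Fraction of actual negatives correctly identified as negative."""
    # Guard: if there are no actual negatives, the ratio is undefined;
    # return 0.0 by convention.
    return tn / (tn + fp) if (tn + fp) > 0 else 0.0

print(specificity(tn=90, fp=10))  # 0.9
```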