Precision and Recall

Precision and Recall are accuracy metrics for binary classification problems, where one class is associated with a 'positive' label and the other with a 'negative' label. They are based on: 'true positives', or TP (the number of samples correctly classified as 'positive'); 'false positives', or FP (the number of samples wrongly classified as 'positive'); and 'false negatives', or FN (the number of samples wrongly classified as 'negative'). The formulas are: Precision = TP/(TP+FP); Recall = TP/(TP+FN). There is typically a trade-off between Precision and Recall as the parameters of the model are adjusted, and often a Precision vs. Recall curve is plotted for various values of a given parameter to evaluate the robustness of the model.
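The formulas above can be sketched in a short Python function; the helper name and example labels are illustrative, and labels are assumed to be encoded as 1 ('positive') and 0 ('negative'):

```python
def precision_recall(y_true, y_pred):
    """Compute Precision = TP/(TP+FP) and Recall = TP/(TP+FN)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    # Guard against division by zero when no positives are predicted/present
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# Example: 2 of 3 predicted positives are correct (precision = 2/3),
# and 2 of 3 actual positives are found (recall = 2/3)
y_true = [1, 1, 1, 0, 0, 0]
y_pred = [1, 1, 0, 1, 0, 0]
precision, recall = precision_recall(y_true, y_pred)
```

Varying a decision threshold on the model's scores and recomputing these two values at each setting yields the points of a Precision vs. Recall curve.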
Related concepts:
Confusion Matrix, ROC Curve