Precision-Recall Curve
A Precision-Recall Curve is a graphical plot of a PR function within a PR space.
- AKA: PR Graph.
- Context:
- It can be an input to a PR-Curve Analysis to illustrate the performance of a classification system (typically binary) as its discrimination threshold is varied.
- It can (typically) be a Binary Classifier Performance Metric.
- Example(s):
- Counter-Example(s):
- a Receiver Operating Characteristic (ROC) Curve, which underlies the AUC metric.
- a Confusion Matrix.
- an Accuracy Metric.
- an F1 Metric.
- See: ROC Convex Hull, Sensitivity (Tests), MAP.
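As the context above notes, a PR curve is traced by sweeping a classifier's discrimination threshold and recording precision against recall at each point. A minimal sketch of that computation (scikit-learn's `precision_recall_curve` implements the same idea; the data and function name here are illustrative):

```python
import numpy as np

def pr_curve(y_true, scores):
    """Return (recall, precision) points obtained by sweeping the
    decision threshold from the highest score downward."""
    order = np.argsort(-np.asarray(scores))   # descending by score
    y = np.asarray(y_true)[order]
    tp = np.cumsum(y)                         # true positives so far
    fp = np.cumsum(1 - y)                     # false positives so far
    precision = tp / (tp + fp)
    recall = tp / y.sum()
    return recall, precision

# Toy labels and scores from a hypothetical binary classifier.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.4, 0.3, 0.2]
recall, precision = pr_curve(y_true, scores)
```

Plotting `precision` against `recall` yields the PR curve; each point corresponds to one threshold setting.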
References
2006
- (Davis & Goadrich, 2006) ⇒ Jesse Davis, and Mark Goadrich. (2006). “The Relationship Between Precision-Recall and ROC Curves.” In: Proceedings of the 23rd International Conference on Machine Learning. ISBN:1-59593-383-2 doi:10.1145/1143844.1143874
- QUOTE: Receiver Operator Characteristic (ROC) curves are commonly used to present results for binary decision problems in machine learning. However, when dealing with highly skewed datasets, Precision-Recall (PR) curves give a more informative picture of an algorithm's performance. We show that a deep connection exists between ROC space and PR space, such that a curve dominates in ROC space if and only if it dominates in PR space.
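The quote's point about skewed datasets can be illustrated with simple arithmetic: a single ROC point (FPR, TPR) implies a different precision depending on the class balance, since precision = TPR·P / (TPR·P + FPR·N). A hedged sketch (the function name and counts are illustrative):

```python
def precision_from_rates(tpr, fpr, n_pos, n_neg):
    # Precision implied by a fixed ROC point (FPR, TPR)
    # at a given class balance.
    tp = tpr * n_pos
    fp = fpr * n_neg
    return tp / (tp + fp)

# The same ROC point (FPR=0.1, TPR=0.9) at two class balances:
balanced = precision_from_rates(0.9, 0.1, 1000, 1000)   # 0.90
skewed   = precision_from_rates(0.9, 0.1, 100, 10000)   # ≈ 0.08
```

The ROC point looks equally good in both cases, but on the skewed dataset precision collapses, which is why the PR curve gives the more informative picture there.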