When evaluating a detection algorithm, you can choose between a "pixel" approach and an "object" approach. You can also consider ROC curves or PR curves. These slides attempt to summarize the specific characteristics of these approaches.
Performance evaluation of object detection algorithms in remote sensing images
1. HOW TO CONDUCT A PERFORMANCE EVALUATION OF DETECTION ALGORITHMS ON REMOTE SENSING IMAGES?
ELISE COLIN KOENIGUER
2. FIRST YOU NEED: A GROUND TRUTH AND A DETECTION RESULT
A ground truth can be: a shapefile, a raster file, detection plots, bounding boxes, …
A detection result can be: a binary map, detection plots
3. THEN YOU NEED TO DEFINE TRUE/FALSE POSITIVE, TRUE/FALSE NEGATIVE
Detection result = POSITIVES: TP (True Positive) or FP (False Positive)
Non-detection = NEGATIVES: TN (True Negative) or FN (False Negative)
If you consider groups of pixels, all four counts can be defined.
If you consider "objects", we cannot define a "background" number, so TN cannot be counted.
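As a minimal sketch of the pixel approach (using NumPy; the helper name and example masks are mine, not from the slides), the four counts can be computed from two binary masks:

```python
import numpy as np

def pixel_confusion(ground_truth, detection):
    """Count TP, FP, FN, TN between two binary masks of the same shape."""
    tp = int(np.sum(ground_truth & detection))    # detected where a target is present
    fp = int(np.sum(~ground_truth & detection))   # detected where no target is present
    fn = int(np.sum(ground_truth & ~detection))   # target present but missed
    tn = int(np.sum(~ground_truth & ~detection))  # background correctly left alone
    return tp, fp, fn, tn

gt  = np.array([[1, 1, 0, 0],
                [0, 0, 0, 0]], dtype=bool)
det = np.array([[1, 0, 1, 0],
                [0, 0, 0, 0]], dtype=bool)
print(pixel_confusion(gt, det))  # (1, 1, 1, 5)
```

Note that TN is simply "everything else in the image": this is exactly the background count that has no equivalent in the object approach.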
4. THEN DEFINE PRECISION, RECALL = PROBABILITY OF DETECTION, AND PROBABILITY OF FALSE ALARMS

Target present (Ground Truth):  Detected → TP = True Positive;  Non-detected → FN = False Negative
Target not here (Ground Truth): Detected → FP = False Positive; Non-detected → TN = True Negative

PRECISION = TP / (TP + FP)
How many, among detected pixels/objects, are relevant?

RECALL = Probability of Detection (PD) = sensitivity = True Positive Rate = TP / (TP + FN)
How many pixels/objects are selected among relevant ones?

Probability of False Alarms (PFA) = 1 − Specificity = FP / (TN + FP)
Cannot be computed with "objects".
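With the four counts in hand, the three ratios are one-liners (a sketch; the function names and example counts are mine):

```python
def precision(tp, fp):
    """Among detected pixels/objects, the fraction that are relevant."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Probability of Detection (PD): fraction of relevant pixels/objects found."""
    return tp / (tp + fn)

def pfa(fp, tn):
    """Probability of False Alarms; pixel approach only, since it needs TN."""
    return fp / (tn + fp)

# Hypothetical counts: 80 hits, 20 false alarms, 10 misses, 900 background pixels
print(precision(80, 20))  # 0.8
print(recall(80, 10))     # 80/90 ≈ 0.889
print(pfa(20, 900))       # 20/920 ≈ 0.0217
```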
5. FINALLY… PLOT THE CURVES. ROC OR PR?

Receiver Operating Characteristic (ROC) curve: PD (Probability of Detection) vs. PFA = 1 − Specificity.
Advantage: takes into account the scarcity of the target (the quantity of "background").

Precision/Recall (PR) curve: PRECISION vs. RECALL.
Advantage: generalizes to each object class.
6. TO SUMMARIZE
Each case-study calculates true negatives /positives in its own way, in terms of:
shape overlap
distance between bounding boxes,
number of pixels detected, etc.
A ROC curve includes the “true negatives” (or the "background" surface).
A Precision/Recall curve does not.
In the "object" approach, defining a true negative doesn’t make sense.
It is therefore not possible to calculate a ROC curve with an object approach.
For classification, precision-recall curves can be generalized to each class
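The role of the true negatives can be checked numerically: with TP, FP, FN held fixed, inflating the background (TN) moves the ROC operating point but leaves the PR point untouched (a sketch with made-up counts):

```python
tp, fp, fn = 50, 50, 50  # fixed detection outcome (made-up counts)

for tn in (1_000, 100_000):  # same scene vs. one with 100x more background
    prec = tp / (tp + fp)    # 0.5 either way: no TN in the formula
    rec  = tp / (tp + fn)    # 0.5 either way: no TN in the formula
    pfa  = fp / (tn + fp)    # shrinks as the background grows
    print(f"TN={tn}: precision={prec}, recall={rec}, PFA={pfa:.4f}")
```

The detector has not changed, yet its PFA (the ROC x-axis) looks 100 times better in the larger scene; the PR point is unaffected by the extra background.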
7. TO GO FURTHER
Davis, J., & Goadrich, M. (2006, June).
The relationship between Precision-Recall and ROC curves.
In Proceedings of the 23rd international conference on Machine learning (pp. 233-240).
Receiver Operator Characteristic (ROC) curves are commonly used for binary decision (detection)
problems in machine learning.
When dealing with highly skewed datasets, where you care more about the rare class, Precision-Recall (PR) curves give a more informative picture of an algorithm's performance.
A curve dominates in ROC space if and only if it dominates in PR space.
It is incorrect to linearly interpolate between points in Precision-Recall space.
Algorithms that optimize the area under the ROC curve are not guaranteed to optimize the area under the
PR curve.
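The interpolation caveat is easy to check with numbers (a sketch following the paper's argument; the operating points are invented). Between two points on a PR curve, it is the TP/FP counts, not precision itself, that vary linearly:

```python
total_positives = 200  # hypothetical number of relevant instances

# Two operating points (made-up confusion counts):
# A: TP=100, FP=0   -> recall 0.5, precision 1.0
# B: TP=200, FP=200 -> recall 1.0, precision 0.5

# Naive linear interpolation in PR space, halfway between A and B (recall 0.75):
naive_precision = (1.0 + 0.5) / 2      # 0.75

# Correct interpolation (Davis & Goadrich): interpolate the counts themselves
tp = (100 + 200) / 2                   # 150 -> recall = 150 / 200 = 0.75
fp = (0 + 200) / 2                     # 100
achievable_precision = tp / (tp + fp)  # 150 / 250 = 0.6, not 0.75

print(naive_precision, achievable_precision)
```

Linear interpolation in PR space thus overstates what the detector can actually achieve between two measured points.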