Is COCO Average Precision (AP) metric averaged over different confidence thresholds? #5402
Unanswered
bartosz-grabowski asked this question in Q&A
There doesn't seem to be a way to set the confidence threshold when using COCOEvaluator. Does that mean the AP is averaged over different confidence thresholds? The class docstring (detectron2/detectron2/evaluation/coco_evaluation.py, line 38 at c69939a) points to a page that doesn't explain this; it only confirms that AP is averaged over different IoU values.
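For context, here is a minimal sketch of how COCO-style AP is typically computed for a single class at a single IoU threshold. This is not detectron2's or pycocotools' actual code; the function name average_precision and the toy numbers are purely illustrative. It shows why there is no single score cutoff to set: detections are ranked by confidence and the full precision-recall curve is traced, so every score value acts as an implicit threshold, and the resulting AP is then additionally averaged over the IoU thresholds 0.50:0.05:0.95.

```python
# Illustrative sketch only -- not detectron2/pycocotools code.
import numpy as np

def average_precision(scores, is_true_positive, num_gt):
    """AP for one class at one IoU threshold.

    scores: confidence of each detection.
    is_true_positive: 1 if the detection matched a ground-truth box
                      at the chosen IoU threshold, else 0.
    num_gt: number of ground-truth boxes for this class.
    """
    order = np.argsort(-np.asarray(scores))  # rank by descending confidence
    tp = np.asarray(is_true_positive, dtype=float)[order]
    fp = 1.0 - tp
    cum_tp = np.cumsum(tp)
    cum_fp = np.cumsum(fp)
    recall = cum_tp / max(num_gt, 1)
    precision = cum_tp / np.maximum(cum_tp + cum_fp, 1e-12)

    # 101-point interpolation (recall levels 0, 0.01, ..., 1.0), as used by
    # the COCO evaluator: at each recall level take the maximum precision
    # achieved at that recall or higher.
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 101):
        p = precision[recall >= r]
        ap += (p.max() if p.size else 0.0) / 101
    return ap

# Toy example: 4 detections of one class, 3 ground-truth boxes.
print(average_precision(scores=[0.9, 0.8, 0.6, 0.3],
                        is_true_positive=[1, 0, 1, 1],
                        num_gt=3))
```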