Why can values in the confusion matrix decrease during evaluation (no normalization is applied)? #5655
Unanswered
tabVersion asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 0 comments
I use multiple files as test sets, and I print the confusion matrix after each file completes. However, I find that some values can decrease compared to the previous iteration. I checked the source code and could not find any operation that would do so, which means the behavior is unexpected.
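Without seeing the original code it is hard to say definitively, but one common cause of this symptom is the metric's state being reset (or a new metric object being created) between files: an accumulating confusion matrix can only grow cell-by-cell, so a printed value that is lower than the previous printout usually means the counts restarted rather than that anything was subtracted. The sketch below uses a hypothetical minimal `ConfusionMatrix` class (not Lightning's or torchmetrics' actual implementation) to illustrate the two behaviors.

```python
# Hypothetical minimal accumulator, for illustration only -- not the
# torchmetrics/Lightning implementation.
class ConfusionMatrix:
    def __init__(self, num_classes):
        self.num_classes = num_classes
        self.mat = [[0] * num_classes for _ in range(num_classes)]

    def update(self, preds, targets):
        # Accumulate counts: rows are true labels, columns are predictions.
        for p, t in zip(preds, targets):
            self.mat[t][p] += 1

    def reset(self):
        # Clearing state is what makes values appear to "decrease".
        self.mat = [[0] * self.num_classes for _ in range(self.num_classes)]


cm = ConfusionMatrix(num_classes=2)

# File 1: counts only ever increase while we accumulate.
cm.update(preds=[0, 1, 1], targets=[0, 1, 0])
after_file_1 = [row[:] for row in cm.mat]  # [[1, 1], [0, 1]]

# If the metric is reset before file 2 (explicitly, or implicitly by a
# framework hook), the next printout can show smaller values even though
# no operation subtracts from the matrix.
cm.reset()
cm.update(preds=[1], targets=[1])
after_file_2 = [row[:] for row in cm.mat]  # [[0, 0], [0, 1]]
```

If the intent is a running total across all files, the fix under this assumption is to accumulate into one matrix and skip the reset until all files are done; if per-file matrices are wanted, decreasing values between printouts are expected and harmless.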
Here is the code:
And here is part of the output: