Chapter 12 - Bug Using Softmax Output on CrossEntropyLoss #222
liereynaldo started this conversation in General
`CrossEntropyLoss` in PyTorch expects raw logits as input, so it is incompatible with a model whose final layer is a `Softmax`: the loss internally applies log-softmax again, and you end up computing `-log(softmax(softmax(logits)))[target]` instead of `-log(softmax(logits))[target]`. To resolve this, either:

- remove the final `Softmax` layer and pass the raw logits directly to `CrossEntropyLoss`, or
- apply `log_softmax` to the model output and use `NLLLoss` instead.

Both fixes are shown in the sketch below.
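A minimal sketch illustrating the issue, using made-up toy data (the batch size, class count, and variable names here are just for demonstration):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
logits = torch.randn(4, 10)            # toy batch: 4 samples, 10 classes
targets = torch.randint(0, 10, (4,))   # toy integer class labels

# Correct: cross_entropy applies log-softmax internally, so it takes raw logits.
correct = F.cross_entropy(logits, targets)

# Buggy: feeding softmax output means softmax gets applied twice inside the loss.
buggy = F.cross_entropy(F.softmax(logits, dim=-1), targets)

print(correct.item(), buggy.item())    # the two values differ

# Alternative fix: if you want explicit log-probabilities, use log_softmax + NLLLoss,
# which is mathematically equivalent to cross_entropy on the raw logits.
alt = F.nll_loss(F.log_softmax(logits, dim=-1), targets)
assert torch.allclose(correct, alt)
```

The double-softmax version still trains (softmax is monotonic), but the gradients are squashed and the loss values are wrong, so convergence degrades silently rather than failing loudly.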