Hello,

Thanks for providing the amazon repo, but I have a question about the loss function in tutorial/0_basic.py. The loss function is loss_func = torch.nn.CrossEntropyLoss(), which applies softmax and log internally. When we use PromptForClassification together with from openprompt.prompts import ManualVerbalizer as the model and verbalizer, under the default setting the output of logits = prompt_model(inputs) has already gone through softmax and log: https://github.com/thunlp/OpenPrompt/blob/main/openprompt/prompts/manual_verbalizer.py#L147. These logits are then passed to loss_func, which applies softmax and log again. Is something wrong here? It doesn't make sense for the logits to go through softmax and log twice. Thanks.
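For reference, here is a minimal PyTorch-only sketch of why the double normalization matters (no OpenPrompt code; raw_scores and labels are made-up tensors). CrossEntropyLoss is equivalent to NLLLoss applied to log_softmax of the raw scores, so if the model already returns log-probabilities, passing them to CrossEntropyLoss normalizes them a second time and generally yields a different loss value.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
raw_scores = torch.randn(4, 3)        # hypothetical unnormalized per-class scores, batch of 4
labels = torch.tensor([0, 2, 1, 0])   # hypothetical gold labels

ce = torch.nn.CrossEntropyLoss()
nll = torch.nn.NLLLoss()

# What a verbalizer that already applies softmax + log would hand back:
log_probs = F.log_softmax(raw_scores, dim=-1)

# Correct pairings: raw scores -> CrossEntropyLoss, or log-probs -> NLLLoss.
print(ce(raw_scores, labels))   # reference loss
print(nll(log_probs, labels))   # identical to the line above

# Double normalization: log-probs fed into CrossEntropyLoss get softmax + log
# applied a second time, so the value differs from the reference loss.
print(ce(log_probs, labels))
```

So if the verbalizer output at the linked line is indeed log-probabilities, using torch.nn.NLLLoss() on that output (or feeding unnormalized scores to CrossEntropyLoss) would avoid applying the transformation twice.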