Problem with loss #3
Comments
@HeVLF Hi, have you normalized your input data? Does the problem come from the cross-entropy loss or the mutual-information loss? Is the weight of the loss function set correctly?
The situation I described was with ABIDE and your default settings, i.e. both cross-entropy and mutual information, with the weight equal to 0.1. When I tried with only cross-entropy, the results were what one would expect, but I still couldn't find the error when using the loss as described in your paper.
The output should be normalized
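Since the thread never shows the actual loss code, here is a minimal sketch of how an inf/nan loss can be avoided when combining cross-entropy with a mutual-information term at weight 0.1. The softmax normalization, the `eps` added inside every `log`, and the MI surrogate `H(mean p) − mean H(p)` are all assumptions made for illustration; they are not the repository's actual implementation.

```python
import numpy as np

def combined_loss(logits, labels, weight=0.1, eps=1e-8):
    """Hedged sketch of a CE + mutual-information loss.

    Assumptions (not from the repo): softmax normalization of the
    raw outputs, an `eps` inside each log to keep the loss finite,
    and H(mean p) - mean H(p) as the MI surrogate.
    """
    # Softmax normalization (the "output should be normalized" step),
    # shifted by the row max for numerical stability.
    z = logits - logits.max(axis=1, keepdims=True)
    p = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)

    # Cross-entropy on the normalized probabilities; without the
    # `eps`, a probability of exactly 0 would give log(0) = -inf.
    ce = -np.log(p[np.arange(len(labels)), labels] + eps).mean()

    # Mutual-information surrogate: entropy of the mean prediction
    # minus the mean per-sample entropy.
    p_mean = p.mean(axis=0)
    h_mean = -(p_mean * np.log(p_mean + eps)).sum()
    h_cond = -(p * np.log(p + eps)).sum(axis=1).mean()
    mi = h_mean - h_cond

    # Subtract the weighted MI term so that maximizing MI lowers the loss.
    return ce - weight * mi

logits = np.array([[2.0, -1.0], [0.5, 0.5], [-3.0, 4.0]])
labels = np.array([0, 1, 1])
loss = combined_loss(logits, labels)
assert np.isfinite(loss)
```

If the loss still goes to inf or nan with something like this in place, the next thing to check is usually the input data itself (unnormalized features or NaNs in the dataset propagate straight into the loss).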
Hello, when I run the code, why is train spe: 0.00000?
Hello, thank you for sharing your work! I am trying to run your code, but I'm encountering some errors when computing the loss: the majority of the epochs produce a loss equal to "inf" or even "nan". I am not sure if I changed something without noticing, but I would love to hear from you or from someone who has faced the same issue.
Best.