
model.n_classes assumed differently for binary classification. #419

Open
mojganmadadi opened this issue Jan 10, 2023 · 2 comments

Comments

@mojganmadadi
Hi,

As far as I can tell, for binary classification model.n_classes should be 2, as the args parameters here suggest. But throughout the code, in several places such as the loss function initialization and the loss calculation, it is assumed to be 1 for binary classification. Maybe I am wrong. Could you let me know the reason? Is it a bug, or is there a reason behind it? Thanks in advance.

@milesial
Owner

Hi, it doesn't really matter if you use 1 or 2 classes for binary classification.
1 will use a single-channel mask, then sigmoid and threshold.
2 will use a 2-channel mask, then argmax.
The losses for both are equivalent. CE with 2 classes is equivalent to BCE.
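
To make that equivalence concrete, here is a minimal sketch (not the repository's code; the tensor shapes and variable names are made up for illustration). It shows that a 2-channel softmax/argmax head and a 1-channel sigmoid/threshold head give the same predictions, and that 2-class cross-entropy matches BCE on the single logit:

```python
import torch
import torch.nn.functional as F

# Hypothetical 2-channel logits for a tiny image (batch=1, C=2, H=2, W=2).
logits_2ch = torch.randn(1, 2, 2, 2)
target = torch.randint(0, 2, (1, 2, 2))          # integer class mask

# n_classes = 2: softmax over channels, predict with argmax.
pred_2ch = logits_2ch.argmax(dim=1)
ce = F.cross_entropy(logits_2ch, target)

# n_classes = 1: a single logit channel, sigmoid then threshold at 0.5.
# The difference of the two logits gives the same probability, since
# softmax([a, b])[1] == sigmoid(b - a).
logits_1ch = logits_2ch[:, 1] - logits_2ch[:, 0]
pred_1ch = (torch.sigmoid(logits_1ch) > 0.5).long()
bce = F.binary_cross_entropy_with_logits(logits_1ch, target.float())

assert torch.equal(pred_1ch, pred_2ch)
print(ce.item(), bce.item())   # equal up to floating-point error
```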

@gordon-n-stevenson
Fixes a bug in the Dice loss where n_classes was not handled correctly:

#446
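
For reference, here is a hedged sketch of a Dice loss that handles both conventions discussed above. This is an illustrative implementation, not the code from #446; the function name dice_loss and its signature are assumptions.

```python
import torch
import torch.nn.functional as F

def dice_loss(logits: torch.Tensor, target: torch.Tensor, n_classes: int,
              eps: float = 1e-6) -> torch.Tensor:
    """Soft Dice loss sketch supporting both n_classes conventions.

    logits: (N, n_classes, H, W) raw network output.
    target: (N, H, W) integer mask, values in [0, n_classes) for
            multi-class, or {0, 1} when n_classes == 1.
    """
    if n_classes == 1:
        # Single-channel binary case: sigmoid probabilities vs. float mask.
        probs = torch.sigmoid(logits).squeeze(1)
        target_f = target.float()
    else:
        # Multi-class case: softmax probabilities vs. one-hot mask.
        probs = F.softmax(logits, dim=1)
        target_f = F.one_hot(target, n_classes).permute(0, 3, 1, 2).float()

    dims = tuple(range(1, probs.dim()))          # sum over all but batch
    intersection = (probs * target_f).sum(dim=dims)
    cardinality = probs.sum(dim=dims) + target_f.sum(dim=dims)
    dice = (2.0 * intersection + eps) / (cardinality + eps)
    return 1.0 - dice.mean()
```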
