inaccurate emotion classification #34
Comments
@octopousprime Thanks for reaching out and submitting this. It sounds related to #33. We'll see what we can do about this, but it sounds like a bug.
Thanks! I would appreciate it if you could let me know when you expect to investigate and resolve this bug, as I have ongoing research that depends on the accuracy of the results.
This doesn't look so similar to issue #33, because that issue describes inaccurate results, whereas @octopousprime's issue describes a completely wrong result of 100%. However, this issue has been stale for a while. I'm going to close it for now as we are trying to get a handle on the most crucial issues. @octopousprime please re-open this if you are still having a problem and we'll find someone to take a look at it - thanks
I'm having the same issue, @microcosm. If I just change [code omitted]
Then I get the following output: [output omitted]
Therefore it seems this model can only predict [a single emotion].
Leaving a comment to say I have the same problem. I trained a model on the FERPlus dataset using Conv_dropout_model. When predicting using fer_model_example, all images return 100% for one of the emotions, e.g. anger = 100%.
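For anyone trying to reproduce this, here is a minimal sketch of how one might inspect the full softmax distribution rather than only the top label. The model file name, test image paths, label order, and the 48x48 grayscale input shape are assumptions based on the FER/FERPlus format, not taken from this repository's example scripts.

```python
# Sketch only: file names, label order, and input shape are illustrative assumptions.
import numpy as np
from tensorflow.keras.models import load_model
from PIL import Image

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def load_face(path):
    # FER/FERPlus-style images are 48x48 grayscale; scale pixels to [0, 1].
    img = Image.open(path).convert("L").resize((48, 48))
    arr = np.asarray(img, dtype=np.float32) / 255.0
    return arr.reshape(1, 48, 48, 1)

model = load_model("conv_dropout_model.h5")  # hypothetical file name

for path in ["angry.png", "happy.png", "neutral.png"]:  # hypothetical test images
    probs = model.predict(load_face(path))[0]
    # Print the whole distribution to see whether one class is genuinely
    # saturated at ~100% for every input, or the other classes are just small.
    print(path, {e: round(float(p), 3) for e, p in zip(EMOTIONS, probs)})
```

If the printed distributions are identical across clearly different faces, the problem is in the pipeline rather than in any single image.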
Hi
I used the pre-trained 7-emotion model, but it gives me the same result (anger = 100%) irrespective of the input image provided.
Can someone please provide me with an explanation of why this is happening?
Thanks in advance.
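One way to narrow this down is a sanity check under the assumption that the model is a standard Keras classifier with a softmax output; the model file name and the 48x48x1 input shape below are placeholders, not paths from this repository. If random noise and a constant image both return the same class at ~100%, the preprocessing or the loaded weights are the more likely culprit than the input images.

```python
# Sanity-check sketch: does the model collapse to one class even on noise?
# Assumptions (not from this repo): Keras .h5 model, 48x48x1 input, softmax output.
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("pretrained_7_emotions.h5")  # hypothetical file name

noise = np.random.rand(1, 48, 48, 1).astype(np.float32)  # random-valued input
ones = np.ones((1, 48, 48, 1), dtype=np.float32)          # constant input

for name, batch in [("noise", noise), ("ones", ones)]:
    probs = model.predict(batch)[0]
    print(name, np.round(probs, 3), "argmax:", int(np.argmax(probs)))
    # If every input, including noise, yields the same class near 1.0,
    # inspect the image normalization and the weight-loading step first.
```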