
sigmoid is missing in QuestionAttnGRU.py #14

Open
keithhans opened this issue Oct 7, 2017 · 5 comments
@keithhans
According to equation 6 in the paper, there should be a sigmoid on K.dot(GRU_inputs, W_g1).
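To make the reported fix concrete, here is a minimal sketch of that gate using NumPy as a stand-in for the Keras backend ops (`gated_gru_inputs` is a hypothetical helper name; shapes are illustrative). The point is only that the projection must pass through a sigmoid before the elementwise product:

```python
import numpy as np

def sigmoid(x):
    """Elementwise logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def gated_gru_inputs(gru_inputs, W_g1):
    """Equation 6 of the paper: g_t = sigmoid(W_g * inputs), and the
    GRU then receives the gated input g_t * inputs.

    gru_inputs: (batch, d) concatenated inputs; W_g1: (d, d) gate weights.
    """
    g = sigmoid(np.dot(gru_inputs, W_g1))  # the sigmoid missing in the code
    return g * gru_inputs
```

Since each gate value lies in (0, 1), the gated inputs are always elementwise smaller in magnitude than the raw inputs; without the sigmoid the "gate" is an unbounded linear map.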

@mahnerak (Member) commented Oct 7, 2017

Completely agree with you. Will be fixed soon.

@keithhans (Author)
Just noticed it is also missing in SelfAttnGRU.py

@mahnerak (Member) commented Oct 7, 2017

We noticed that as well.
The sigmoid disappeared after we switched from Dense layers (with sigmoid activation) to SharedWeight layers.
Now I'm trying to repeat the training process; we hope to get better scores after fixing this issue.
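A small sketch of how that kind of refactor loses the activation (hypothetical helper names, NumPy in place of layer objects): a Dense layer bundles the projection and its activation, while a bare shared weight matrix is only the projection, so every call site must reapply the sigmoid explicitly.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Before the refactor: a Dense-style layer bundles projection + activation.
def dense_with_sigmoid(x, W):
    return sigmoid(np.dot(x, W))

# After the refactor: a shared weight matrix is only the projection,
# so the activation is silently dropped unless reapplied at the call site.
def shared_weight(x, W):
    return np.dot(x, W)

def gate_fixed(x, W):
    return sigmoid(shared_weight(x, W))  # restores the original behavior
```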

@mahnerak (Member)

The pull request is merged.
Not closing the issue until we compare the new scores.
@MartinXPN, please rerun the steps from README.md.

@MartinXPN (Member)

After repeating all the steps in the README, the performance didn't change much: accuracy reached up to 60% (a bit less).
