
Trainable weights in AttentionLSTMWrapper #33

Open
saurabhmathur96 opened this issue Feb 28, 2017 · 0 comments

@saurabhmathur96

This statement (https://github.com/codekansas/keras-language-modeling/blob/master/attention_lstm.py#L106) overwrites the trainable_weights of the inner LSTM layer, so the wrapped LSTM's own parameters are no longer registered as trainable. It should append the wrapper's weights to the LSTM layer's weights instead.
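
A minimal sketch of the problem and the suggested fix, not the actual attention_lstm.py code: the attention parameter names (`W_att`, `b_att`) and their shapes are hypothetical placeholders, and the commented-out assignment only illustrates what the referenced statement effectively does.

```python
import numpy as np
from keras import backend as K
from keras.layers import Wrapper


class AttentionLSTMWrapper(Wrapper):
    """Sketch of a wrapper that adds attention parameters on top of an LSTM."""

    def build(self, input_shape):
        # Build the wrapped LSTM first so self.layer.trainable_weights is populated.
        if not self.layer.built:
            self.layer.build(input_shape)
            self.layer.built = True
        super(AttentionLSTMWrapper, self).build()

        dim = input_shape[-1]
        # Hypothetical attention parameters created by the wrapper.
        self.W_att = K.variable(np.random.normal(scale=0.05, size=(dim, dim)), name='W_att')
        self.b_att = K.variable(np.zeros(dim), name='b_att')

        # Problematic pattern: assigning a new list replaces the inner LSTM's
        # trainable weights, so only the attention parameters get updated.
        # self.trainable_weights = [self.W_att, self.b_att]

        # Suggested fix: include the wrapped LSTM's weights as well.
        self.trainable_weights = self.layer.trainable_weights + [self.W_att, self.b_att]
```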
