In the last step, when introducing the decoder RNN of the attention model, two inputs are mentioned: "a word embedding vector, and an initialized decoder hidden state, shown in the figure as $h_{init}$." Where do these two inputs come from? Which text is the word embedding computed from?
I think the word embedding is the embedding of `<bos>`, the token that marks the start of the sequence.
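A minimal sketch of that first decoder step, assuming a vanilla RNN cell and random stand-in weights (in a real seq2seq model, `h_init` is commonly the encoder's final hidden state, and the embedding table is learned):

```python
import numpy as np

np.random.seed(0)

# Hypothetical tiny vocabulary; <bos> marks sequence start.
vocab = {"<bos>": 0, "<eos>": 1, "hello": 2, "world": 3}
embed_dim, hidden_dim = 4, 4

# Learned embedding table in a real model; random stand-in here.
embedding = np.random.randn(len(vocab), embed_dim)

# h_init: typically the encoder's final hidden state; random stand-in here.
h_init = np.random.randn(hidden_dim)

# Vanilla RNN decoder cell parameters (hypothetical).
W_xh = np.random.randn(hidden_dim, embed_dim)
W_hh = np.random.randn(hidden_dim, hidden_dim)
b_h = np.zeros(hidden_dim)

def rnn_step(x, h):
    """One vanilla RNN step: h' = tanh(W_xh @ x + W_hh @ h + b)."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# First decoder step: the input is the <bos> embedding,
# and the state is h_init -- exactly the two inputs in question.
x0 = embedding[vocab["<bos>"]]
h1 = rnn_step(x0, h_init)
print(h1.shape)
```

At later steps, the input embedding is the previously generated token (or, with teacher forcing, the gold token), and the state is the previous step's hidden output.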