Commit
add comment
Dobiasd committed Dec 26, 2023
1 parent 1b6597e commit 9743a49
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions include/fdeep/layers/multi_head_attention_layer.hpp
@@ -33,6 +33,8 @@ class multi_head_attention_layer : public layer
//const tensor& query = input[0];
//const tensor& value = input[1];
//const tensor& key = input.size() > 2 ? input[2] : value;
// https://towardsdatascience.com/transformers-explained-visually-part-3-multi-head-attention-deep-dive-1c1ff1024853
// https://dmol.pub/dl/attention.html#multi-head-attention-block
// https://github.com/keras-team/keras/blob/v2.14.0/keras/layers/attention/multi_head_attention.py
// https://gist.github.com/sevagh/b71d253a347a9b59c026580625452fc5
return input;
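The layer shown in the hunk is still a stub (it returns its input unchanged), and the added comments link to explanations of what it should eventually compute. The core of each attention head, per those references, is scaled dot-product attention: `softmax(Q·Kᵀ / sqrt(d))·V`. The following is a minimal illustrative sketch of that single-head core in plain C++; it does not use frugally-deep's `tensor` API, and the `Matrix` alias and function name are assumptions for the example only.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

using Matrix = std::vector<std::vector<double>>;

// Q: (n, d), K: (m, d), V: (m, dv)  ->  output: (n, dv)
Matrix scaled_dot_product_attention(const Matrix& Q, const Matrix& K, const Matrix& V) {
    const std::size_t n = Q.size();
    const std::size_t m = K.size();
    const std::size_t d = Q.front().size();
    const std::size_t dv = V.front().size();
    Matrix out(n, std::vector<double>(dv, 0.0));
    for (std::size_t i = 0; i < n; ++i) {
        // scores[j] = (Q_i . K_j) / sqrt(d)
        std::vector<double> scores(m, 0.0);
        double max_score = -1e300;
        for (std::size_t j = 0; j < m; ++j) {
            for (std::size_t k = 0; k < d; ++k) {
                scores[j] += Q[i][k] * K[j][k];
            }
            scores[j] /= std::sqrt(static_cast<double>(d));
            max_score = std::max(max_score, scores[j]);
        }
        // numerically stable softmax over the scores
        double denom = 0.0;
        for (double& s : scores) {
            s = std::exp(s - max_score);
            denom += s;
        }
        for (double& s : scores) {
            s /= denom;
        }
        // output row = attention-weighted sum of value rows
        for (std::size_t j = 0; j < m; ++j) {
            for (std::size_t k = 0; k < dv; ++k) {
                out[i][k] += scores[j] * V[j][k];
            }
        }
    }
    return out;
}
```

A full multi-head layer, as described in the linked Keras source, would additionally project the inputs with learned per-head weight matrices, run this core once per head, concatenate the head outputs, and apply a final output projection.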
