
Commit

Add comment with link to stackoverflow question
Dobiasd committed Nov 1, 2023
1 parent 60f77d5 commit 14aa03a
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions include/fdeep/layers/multi_head_attention_layer.hpp
@@ -30,6 +30,7 @@ class multi_head_attention_layer : public layer
tensors apply_impl(const tensors& input) const override
{
        // input.size() is 1. How shall the other tensors be passed here? How is it done in TF?
// https://stackoverflow.com/questions/77400589/what-is-the-reason-for-multiheadattention-having-a-different-call-convention-tha
// todo: implement
return input;
}
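For context on the open question in the comment: in TF Keras, `MultiHeadAttention` is called with separate `query` and `value` (and optionally `key`) arguments rather than a single input list, which is what the linked Stack Overflow question discusses. The core computation the `todo` will eventually need can be sketched in NumPy (illustrative names only, not fdeep API):

```python
import numpy as np

def scaled_dot_product_attention(query, key, value):
    """Core attention step: softmax(Q K^T / sqrt(d)) V."""
    d = query.shape[-1]
    scores = query @ key.T / np.sqrt(d)            # (seq_q, seq_k)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # row-wise softmax
    return weights @ value                         # (seq_q, d_v)

# Self-attention: query, key, and value are the same tensor, which is
# the one-input case apply_impl currently receives.
x = np.random.rand(4, 8)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

In the self-attention case the three arguments coincide, matching the single tensor `apply_impl` gets today; cross-attention would need the extra tensors the comment asks about.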
