I am having trouble measuring inference time. I set both the sequence length and the batch size to 1 and benchmarked on a single RTX 4090 GPU. The LSTM takes about 4 ms per step, while the S5 block takes about 7 ms.
I tried switching from forward to forward_rnn to measure inference time, as you suggested, but I hit the same error that @yuyangpoi reported.
So I would like to know: what is the correct way to measure the inference time of the S5 block?
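For reference, here is the timing harness I am using, as a minimal sketch. It assumes a PyTorch setup; the LSTM is a stand-in for whatever module is being measured (the S5 block's forward_rnn would replace the lambda), and the warmup/iteration counts are arbitrary choices. Note that without torch.cuda.synchronize() around each step, CUDA's asynchronous execution makes the measured times meaningless.

```python
import time
import torch

def median_latency_ms(step, warmup=20, iters=100):
    """Median wall-clock latency (ms) of one call to `step`.

    Runs `warmup` untimed calls first, and synchronizes the GPU
    before reading the clock so queued CUDA kernels are included.
    """
    with torch.no_grad():
        for _ in range(warmup):
            step()
        if torch.cuda.is_available():
            torch.cuda.synchronize()
        times = []
        for _ in range(iters):
            t0 = time.perf_counter()
            step()
            if torch.cuda.is_available():
                torch.cuda.synchronize()
            times.append((time.perf_counter() - t0) * 1e3)
    return sorted(times)[len(times) // 2]

# Example: single-step LSTM latency with batch=1, seq_len=1.
# Feature size 128 is an arbitrary placeholder.
device = "cuda" if torch.cuda.is_available() else "cpu"
lstm = torch.nn.LSTM(input_size=128, hidden_size=128).to(device).eval()
x = torch.randn(1, 1, 128, device=device)  # (seq_len, batch, features)
print(f"LSTM per-step latency: {median_latency_ms(lambda: lstm(x)):.3f} ms")
```

Swapping the lambda for a call into the S5 block (once forward_rnn works) should make the two numbers directly comparable.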
Hi @NikolaZubic