This repository has been archived by the owner on Sep 7, 2023. It is now read-only.
Try 2d, 3d CNN before attention #63
Nice! How well does the new model perform?! :)
I don't have a direct comparison to running it without the CNN.
Cool beans. No rush, but it might be nice to have a direct comparison, so we can get a feel for whether it's useful to place a CNN before the attention. What do you think?
Yeah, I think it would be useful to do. Could also see whether 0, 1, ..., N layers of CNN are useful?
Sounds great!
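The thread doesn't show the repo's model code, but the "0, 1, ..., N layers of CNN" experiment above can be sketched independently. A minimal NumPy sketch (all names here are hypothetical, not from the repo): stack a variable number of valid 3x3 convolutions ahead of the rest of the model, with `n_layers=0` serving as the no-CNN baseline.

```python
import numpy as np

def conv2d_valid(x, k):
    """Valid cross-correlation of a 2D input x with kernel k."""
    H, W = x.shape
    kH, kW = k.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kH, j:j + kW] * k)
    return out

def cnn_stack(x, n_layers, k):
    """Apply n_layers copies of the same conv; n_layers=0 is the baseline."""
    for _ in range(n_layers):
        x = conv2d_valid(x, k)
    return x

x = np.random.rand(16, 16)      # toy 16x16 input frame
k = np.ones((3, 3)) / 9.0       # 3x3 averaging kernel as a stand-in
for n in range(3):
    print(n, cnn_stack(x, n, k).shape)
# each extra 3x3 valid conv shrinks each spatial dim by 2:
# 0 (16, 16) / 1 (14, 14) / 2 (12, 12)
```

Sweeping `n_layers` this way would give the comparison discussed above: identical inputs, identical downstream model, only the CNN depth varying.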
related: #35
3D cnn: sees two timesteps at once
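The "sees two timesteps at once" property comes from giving the 3D kernel a depth of 2 along the time axis. A minimal NumPy sketch (hypothetical names, not the repo's implementation) of a single-channel valid 3D convolution over a (time, height, width) stack:

```python
import numpy as np

def conv3d_valid(x, k):
    """Valid cross-correlation of x (T, H, W) with kernel k (kT, kH, kW)."""
    T, H, W = x.shape
    kT, kH, kW = k.shape
    out = np.zeros((T - kT + 1, H - kH + 1, W - kW + 1))
    for t in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[t, i, j] = np.sum(x[t:t + kT, i:i + kH, j:j + kW] * k)
    return out

x = np.random.rand(4, 8, 8)     # 4 timesteps of 8x8 frames
k = np.ones((2, 3, 3)) / 18.0   # kernel depth 2: each output mixes 2 timesteps
y = conv3d_valid(x, k)
print(y.shape)  # (3, 6, 6): T shrinks by 1 because adjacent timesteps pair up
```

Each output position therefore aggregates two adjacent timesteps before the features reach the attention layers, which is the idea the issue title describes.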