Flash and Linear Attention mechanisms added to the TabTransformer

@jrzaurin jrzaurin released this 06 Aug 10:50
· 112 commits to master since this release
2ef478c
  1. Added Flash Attention
  2. Added Linear Attention
  3. Revisited and polished the docs
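Linear attention avoids materializing the n×n attention matrix by applying a positive feature map φ to the queries and keys and reassociating the product as φ(Q)(φ(K)ᵀV), dropping the cost from O(n²·d) to O(n·d²). A minimal NumPy sketch of the idea, using the common elu(x)+1 feature map — this is illustrative only, and not the actual pytorch-widedeep implementation:

```python
import numpy as np

def feature_map(x):
    # elu(x) + 1: a common positive feature map for linear attention
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(q, k, v):
    # q, k: (n, d); v: (n, d_v)
    qp, kp = feature_map(q), feature_map(k)
    kv = kp.T @ v                    # (d, d_v), computed without an n x n matrix
    z = qp @ kp.sum(axis=0)          # (n,) normalizer
    return (qp @ kv) / z[:, None]    # (n, d_v)

def softmax_attention(q, k, v):
    # standard attention for comparison: builds the full (n, n) score matrix
    scores = q @ k.T / np.sqrt(q.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ v
```

Both functions return an (n, d_v) output; the linear variant only ever forms (d, d_v)-sized intermediates, which is what makes it attractive for long token sequences.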