
Add Transformer as embedding net #1413

Open · 3 tasks
manuelgloeckler opened this issue Feb 27, 2025 · 0 comments
Labels
embedding_net (New default embedding nets) · enhancement (New feature or request) · hackathon

Comments

manuelgloeckler (Contributor) commented Feb 27, 2025

🚀 Feature Request

Transformers can serve as flexible embedding networks for general data modalities. We currently have permutation-invariant networks, whereas plain transformers are permutation equivariant (allowing support for exchangeable but not independent data). With suitable positional embeddings, a transformer can also serve as a general-purpose embedding network.
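As a quick illustration of the equivariance point (a minimal sketch using only PyTorch built-ins, nothing sbi-specific): without positional embeddings, permuting the input sequence of a plain transformer encoder permutes its output in exactly the same way.

```python
import torch
from torch import nn

encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=16, nhead=4, batch_first=True),
    num_layers=2,
)
encoder.eval()  # disable dropout so the two forward passes are comparable

x = torch.randn(1, 5, 16)  # (batch, sequence, features)
perm = torch.randperm(5)

with torch.no_grad():
    out = encoder(x)
    out_perm = encoder(x[:, perm])

# Permutation equivariance: permuting inputs permutes outputs identically.
print(torch.allclose(out[:, perm], out_perm, atol=1e-5))  # True
```

Adding positional embeddings before the encoder breaks this symmetry, which is exactly what makes the same backbone usable for ordered data.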

Describe the solution you'd like

To do so, the following steps have to be completed:

  • Add a PyTorch transformer class here (a rough sketch follows this list).
  • Currently, all flows require a statically sized input, so the output sequence of the transformer needs to be pooled into a single vector of fixed dimension. There are multiple ways to do this, and some testing/literature research is needed to decide on a default (though multiple methods can be implemented); the sketch below shows two common options.
  • Add tests.
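As a starting point, here is a minimal sketch of what such a class could look like (the class name, defaults, and the `pooling` argument are illustrative assumptions, not an existing sbi API), showing mean pooling and a BERT-style learned CLS token as two pooling options:

```python
import torch
from torch import nn


class TransformerEmbedding(nn.Module):
    """Sketch: transformer embedding net with a pooled, fixed-size output."""

    def __init__(self, input_dim, d_model=64, nhead=4, num_layers=2, pooling="mean"):
        super().__init__()
        self.input_proj = nn.Linear(input_dim, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.pooling = pooling
        if pooling == "cls":
            # Learned token prepended to the sequence; its final state is the embedding.
            self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))

    def forward(self, x):
        # x: (batch, sequence, input_dim)
        h = self.input_proj(x)
        if self.pooling == "cls":
            cls = self.cls_token.expand(h.shape[0], -1, -1)
            h = torch.cat([cls, h], dim=1)
        h = self.encoder(h)
        if self.pooling == "cls":
            return h[:, 0]  # (batch, d_model), embedding of the CLS token
        return h.mean(dim=1)  # (batch, d_model), mean-pooled over the sequence
```

Note that mean pooling keeps the overall embedding permutation invariant (useful for exchangeable data), while a CLS token is the common choice once positional embeddings are added.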

📌 Additional Context

Currently, other "sequence" models, such as the permutation-invariant networks, support learning on sequences of different sizes in parallel via NaN-padding. One could consider adding this support here, too (if not, please open a separate issue); a sketch of how the padding could be masked follows below.
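If NaN-padding were added, the padded positions could be translated into an attention mask via the `src_key_padding_mask` argument of PyTorch's transformer encoder. A sketch under that assumption (the all-NaN convention here is hypothetical, mirroring the description above):

```python
import torch
from torch import nn

def nan_padding_mask(x):
    """True at padded positions; x: (batch, sequence, features)."""
    # A position counts as padding if all of its features are NaN.
    return torch.isnan(x).all(dim=-1)

x = torch.randn(2, 6, 8)
x[0, 4:] = float("nan")     # first sequence has true length 4
mask = nan_padding_mask(x)  # (batch, sequence), True where padded
x = torch.nan_to_num(x)     # replace NaNs before the forward pass

encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=8, nhead=2, batch_first=True),
    num_layers=1,
)
out = encoder(x, src_key_padding_mask=mask)  # padded positions ignored in attention
```

Any pooling step would then also need to respect the mask (e.g. a masked mean instead of a plain mean).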

Issues #1324 and #218 currently soft-block variable sequence lengths, but they should not affect this feature request.

manuelgloeckler added the enhancement, hackathon, and embedding_net labels on Feb 27, 2025