
Transformer takes input length of 1? #7

Open

pauln2k opened this issue Mar 19, 2024 · 2 comments
Comments

@pauln2k

pauln2k commented Mar 19, 2024

No description provided.

@pauln2k
Author

pauln2k commented Mar 19, 2024

Looks like your second transformer just takes a sequence length of 1. Is this correct? What's the point of the transformer if the input is only one sequence long?

@pauln2k pauln2k changed the title Looks like your second transformer just takes a sequence length of 1. is this correct? What's the point of the transformer if the input is only one sequence long? Transformer takes input length of 1? Mar 19, 2024
@XiuzeZhou
Owner

Our model has two modes: if the input length is set to 1, the transformer is treated as an encoder; otherwise, it is treated as a sequence model.
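A minimal sketch of what such a two-mode setup could look like, assuming a PyTorch-style transformer encoder; this is not the repository's code, and the class, parameter names, and dimensions below are illustrative only.

```python
import torch
import torch.nn as nn

class TwoModeTransformer(nn.Module):
    """Hypothetical model: seq_len=1 acts as a per-step encoder,
    seq_len>1 acts as a sequence model over a window of steps."""
    def __init__(self, feature_dim=16, d_model=64, nhead=4, num_layers=2, seq_len=1):
        super().__init__()
        self.seq_len = seq_len
        self.embed = nn.Linear(feature_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):
        # x: (batch, seq_len, feature_dim)
        h = self.encoder(self.embed(x))
        if self.seq_len == 1:
            # "encoder" mode: a single time step, so attention only mixes
            # the one embedded position (no temporal context).
            return self.head(h[:, 0])
        # "sequence" mode: self-attention operates across the whole window;
        # predict from the last position.
        return self.head(h[:, -1])

# Usage: same module, two input lengths.
x1 = torch.randn(8, 1, 16)   # single-step input
x8 = torch.randn(8, 8, 16)   # 8-step window
print(TwoModeTransformer(seq_len=1)(x1).shape)  # torch.Size([8, 1])
print(TwoModeTransformer(seq_len=8)(x8).shape)  # torch.Size([8, 1])
```

With an input length of 1, self-attention has only one position to attend to, so the transformer effectively reduces to a feed-forward encoder of that single step; the sequence-modeling benefit only appears when the window is longer than 1.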
