
Where is the implementation of the whole moe in example/translation/translation_moe #5590

Open
MastrOrigami opened this issue Jan 16, 2025 · 0 comments

Comments

@MastrOrigami

❓ Questions and Help

I can't find the implementation of the whole MoE architecture in example/translation/translation_moe.
It seems that example/translation/translation_moe/translation_moe_src contains the implementation of the TranslationMoETask class and the mean_pool_gating_network, but I can't find the implementation of the experts; translation_moe.py just uses TransformerModel and adds a gating_network attribute to it. Is there no separate implementation of the experts, and is ALL of the MoE code in /fairseq/examples/translation_moe/translation_moe_src?
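For context, a mean-pooling gating network of the kind named above is usually a small module that pools the encoder states over time and maps them to a distribution over experts, rather than a container of expert sub-modules. The sketch below is NOT the fairseq implementation, just a minimal illustration of the idea; the layer sizes and the tanh nonlinearity are assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MeanPoolGatingNetwork(nn.Module):
    """Illustrative mean-pool gating network (not fairseq's actual code):
    mean-pool encoder states over the time dimension, then map the pooled
    vector to log-probabilities over the experts."""

    def __init__(self, embed_dim: int, num_experts: int, dropout: float = 0.0):
        super().__init__()
        self.fc1 = nn.Linear(embed_dim, embed_dim)
        self.dropout = nn.Dropout(dropout)
        self.fc2 = nn.Linear(embed_dim, num_experts)

    def forward(self, encoder_out: torch.Tensor) -> torch.Tensor:
        # encoder_out: (seq_len, batch, embed_dim), as in fairseq's encoder output layout
        x = encoder_out.mean(dim=0)            # mean-pool over time -> (batch, embed_dim)
        x = torch.tanh(self.fc1(x))            # assumed nonlinearity for the sketch
        x = self.dropout(x)
        return F.log_softmax(self.fc2(x), dim=-1)  # (batch, num_experts) log-probs

# Example: 7 time steps, batch of 2, hidden size 16, 4 experts
gate = MeanPoolGatingNetwork(embed_dim=16, num_experts=4)
log_probs = gate(torch.randn(7, 2, 16))
print(log_probs.shape)  # torch.Size([2, 4])
```

Under this view, the "experts" need not be separate modules at all: a single shared TransformerModel can be conditioned on an expert index, with the gating network only scoring which expert to use.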
