❓ Questions and Help
I can't find the implementation of the full MoE architecture in `example/translation/translation_moe`. It seems that `example/translation/translation_moe/translation_moe_src` contains the implementation of the `TranslationMoETask` class and the `mean_pool_gating_network`, but I can't find where the experts are implemented: in `translation_moe.py` the MoE simply uses `TransformerModel` and adds a `gating_network` attribute to it. Is there really no separate implementation of the experts, and is ALL of the MoE code in `/fairseq/examples/translation_moe/translation_moe_src`?
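For context on what a `mean_pool_gating_network` conceptually does, here is a minimal numpy sketch (not fairseq's actual code, which is a PyTorch module; the function and variable names here are illustrative assumptions): the encoder output is mean-pooled over the time dimension, projected to one logit per expert, and normalized with a softmax to give expert-selection probabilities.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def mean_pool_gating(encoder_out, W, b):
    """Conceptual mean-pool gating network (illustrative, not fairseq's code).

    encoder_out: (seq_len, dim) array of encoder states.
    W, b:        projection to num_experts logits.
    Returns a probability distribution over experts.
    """
    pooled = encoder_out.mean(axis=0)   # (dim,) - average over time steps
    logits = pooled @ W + b             # (num_experts,)
    return softmax(logits)

# Toy usage with random inputs.
rng = np.random.default_rng(0)
seq_len, dim, num_experts = 5, 8, 3
probs = mean_pool_gating(
    rng.normal(size=(seq_len, dim)),
    rng.normal(size=(dim, num_experts)),
    np.zeros(num_experts),
)
```

`probs` sums to 1 and has one entry per expert; in an MoE translation model these probabilities weight (or select among) the expert outputs.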