Phi 3.5 MOE and supporting MOE in general #636
niceblue88 started this conversation in General
What does it take to get a new MoE model running in exl2 format? There have obviously been a few in the past, notably Mixtral 8x7B and DBRX. What configuration or functions needed to be added to support, say, Phi 3.5 MoE? And can this process be generalized for any future MoE LLMs that are released? I'm happy to try to do it, but I'm not sure how to start.
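
For concreteness, here is a minimal PyTorch sketch of the sparse-MoE feed-forward block that Mixtral-style decoder layers (and, as far as I can tell, Phi 3.5 MoE with 16 experts and top-2 routing) use in place of a dense MLP. This is essentially the piece a backend has to understand to support these models. All names here (`SparseMoeBlock`, `ExpertMLP`, the sizes in the example) are illustrative assumptions, not exllamav2 code, and individual models differ in their exact routing details:

```python
# Illustrative sketch of a Mixtral/Phi-3.5-MoE-style sparse MoE MLP in plain PyTorch.
# Class and parameter names are hypothetical, not taken from exllamav2.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ExpertMLP(nn.Module):
    """One gated feed-forward expert (SwiGLU-style), like a dense Llama/Mixtral MLP."""

    def __init__(self, hidden_size: int, intermediate_size: int):
        super().__init__()
        self.gate_proj = nn.Linear(hidden_size, intermediate_size, bias=False)
        self.up_proj = nn.Linear(hidden_size, intermediate_size, bias=False)
        self.down_proj = nn.Linear(intermediate_size, hidden_size, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.down_proj(F.silu(self.gate_proj(x)) * self.up_proj(x))


class SparseMoeBlock(nn.Module):
    """Replaces the dense MLP in a decoder layer: route each token to its top_k experts."""

    def __init__(self, hidden_size: int, intermediate_size: int, num_experts: int, top_k: int):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(hidden_size, num_experts, bias=False)
        self.experts = nn.ModuleList(
            ExpertMLP(hidden_size, intermediate_size) for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        bsz, seq_len, hidden = x.shape
        flat = x.view(-1, hidden)                      # (tokens, hidden)
        logits = self.router(flat)                     # (tokens, num_experts)
        weights, selected = torch.topk(logits, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1, dtype=torch.float32).to(flat.dtype)

        out = torch.zeros_like(flat)
        for expert_idx, expert in enumerate(self.experts):
            # Which tokens picked this expert, and in which of their top_k slots
            token_idx, slot = (selected == expert_idx).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue                               # no tokens routed here this step
            out[token_idx] += weights[token_idx, slot, None] * expert(flat[token_idx])
        return out.view(bsz, seq_len, hidden)


# Example: Phi 3.5 MoE reportedly uses 16 experts with top-2 routing (sizes here are made up).
moe = SparseMoeBlock(hidden_size=512, intermediate_size=1024, num_experts=16, top_k=2)
y = moe(torch.randn(1, 8, 512))  # (batch, seq_len, hidden) in, same shape out
```

If I understand correctly, adding such a model to a backend then mostly comes down to mapping its config keys (number of experts, experts per token) and its router/expert tensor names onto a block like this, and making quantization and the forward pass handle the repeated per-expert weights.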
Replies: 1 comment

realised there was another thread important for this: #512