@jklj077 I successfully deployed a Qwen model with openai_api.py. How can I deploy multiple models behind the same API, so that the client can choose a model by model_name, e.g. llm_1 = ChatOpenAI(model_name="Qwen-1.8B", openai_api_base='same_api'), llm_2 = ChatOpenAI(model_name="Qwen-14B", openai_api_base='same_api')?
Answered by jklj077 on Apr 18, 2024
I don't think this is supported. Frameworks like FastChat are more likely to support this use case.
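For context, what a multi-model server does is dispatch each request on the `model` field of the OpenAI-style request body. The sketch below illustrates that routing idea only; the names (`MODEL_REGISTRY`, `register_model`, `handle_chat_completion`) are hypothetical and not part of openai_api.py or FastChat, and the lambdas stand in for real inference backends.

```python
# Illustrative sketch of model-name routing behind one OpenAI-compatible
# endpoint. All names here are made up for the example.

MODEL_REGISTRY = {}

def register_model(name, generate_fn):
    """Map a client-visible model name to a generate callable."""
    MODEL_REGISTRY[name] = generate_fn

def handle_chat_completion(request: dict) -> str:
    """Dispatch on request["model"], as a multi-model server would."""
    model_name = request.get("model")
    if model_name not in MODEL_REGISTRY:
        raise ValueError(f"Unknown model: {model_name!r}")
    return MODEL_REGISTRY[model_name](request["messages"])

# Stand-ins for two loaded Qwen checkpoints:
register_model("Qwen-1.8B", lambda msgs: "reply from Qwen-1.8B")
register_model("Qwen-14B", lambda msgs: "reply from Qwen-14B")
```

With this kind of dispatch in place, both ChatOpenAI clients in the question could point at the same openai_api_base and be served by different checkpoints purely via model_name.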
Answer selected by samosun