Model type "internlm/internlm2_5-7b-chat" is not supported #205

Open
hydra-bu opened this issue Sep 17, 2024 · 0 comments
Deploying the cloud-service mode on a Mac, the server reports that the model is not supported.

The docker-compose configuration is as follows:

    environment:
      - PYTHONUNBUFFERED=1
      - SILICON_API_KEY="sk-xxxx"
      - SILICON_MODEL="internlm/internlm2_5-7b-chat"
    command: python -m mindsearch.app --lang cn --model_format internlm_silicon --search_engine BingSearch
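One thing worth checking (an assumption on my part, not confirmed from the logs): in docker-compose's list-style `environment` entries, quote characters after the `=` are not stripped — they become a literal part of the value. A model name that arrives as `"internlm/internlm2_5-7b-chat"` (with embedded quotes) would then fail any exact-match lookup against a supported-model table. A minimal sketch of the effect, using a hypothetical supported-model set:

```python
import os

# Simulate what the container sees for the list-style entry
#   - SILICON_MODEL="internlm/internlm2_5-7b-chat"
# docker-compose splits at '=' and keeps the quotes as part of the value.
os.environ["SILICON_MODEL"] = '"internlm/internlm2_5-7b-chat"'

# Hypothetical supported-model table (illustrative only, not MindSearch's real list).
supported = {"internlm/internlm2_5-7b-chat", "Qwen/Qwen2-7B-Instruct"}

model = os.environ["SILICON_MODEL"]
print(model in supported)            # False: the literal quotes break the match
print(model.strip('"') in supported) # True once the quotes are removed
```

If this is the cause, writing the entries without quotes (`- SILICON_MODEL=internlm/internlm2_5-7b-chat`) should change the behavior.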
INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:8002 (Press CTRL+C to quit)


INFO:     172.18.0.3:57556 - "POST /solve HTTP/1.1" 200 OK
ERROR:root:Exception in sync_generator_wrapper: Model type "internlm/internlm2_5-7b-chat" is not supported

In addition, MSDL fails on the Mac with a file-not-found error, so I had to adjust the generated docker-compose by hand. Following the documentation it starts successfully, but running a search reports that the model is not supported. Trying other models gives the same error:

ERROR:root:Exception in sync_generator_wrapper: Model type "Qwen/Qwen2-7B-Instruct" is not supported