System Info / 系統信息
Python 3.8, CUDA 11.8
Who can help? / 谁可以帮助到您?
@zr
Information / 问题信息
Reproduction / 复现过程
python cli_demo_sat.py --from_pretrained cogvlm-chat --version chat_old --bf16 --stream_chat
[2024-09-05 15:30:54,430] [WARNING] Failed to load bitsandbytes:No module named 'bitsandbytes'
Traceback (most recent call last):
File "/home/lt/2024/llm/CogVLM/basic_demo/cli_demo_sat.py", line 161, in <module>
main()
File "/home/lt/2024/llm/CogVLM/basic_demo/cli_demo_sat.py", line 36, in main
model, model_args = AutoModel.from_pretrained(
File "/home/lt/miniconda3/lib/python3.10/site-packages/sat/model/base_model.py", line 342, in from_pretrained
return cls.from_pretrained_base(name, args=args, home_path=home_path, url=url, prefix=prefix, build_only=build_only, overwrite_args=overwrite_args, **kwargs)
File "/home/lt/miniconda3/lib/python3.10/site-packages/sat/model/base_model.py", line 317, in from_pretrained_base
model_path = auto_create(name, path=home_path, url=url)
File "/home/lt/miniconda3/lib/python3.10/site-packages/sat/resources/download.py", line 53, in auto_create
url = MODEL_URLS[name]
KeyError: 'cogvlm-chat'
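The traceback pinpoints the failure: `auto_create` in `sat/resources/download.py` indexes a `MODEL_URLS` dictionary directly, so a model name missing from that table surfaces as a bare `KeyError` instead of a clear "unknown model" message. A minimal sketch of that lookup pattern (the dict contents and the fallback message here are illustrative, not the actual sat source):

```python
# Illustrative stand-in for the MODEL_URLS table in sat/resources/download.py.
# The real table ships inside the installed sat package and may or may not
# contain an entry for "cogvlm-chat" depending on the sat version.
MODEL_URLS = {
    "cogvlm-base-224": "https://example.invalid/cogvlm-base-224.zip",  # hypothetical entry
}

def auto_create(name: str) -> str:
    # The real code does `url = MODEL_URLS[name]`, so an unknown name
    # (e.g. "cogvlm-chat") raises KeyError with no further explanation.
    # Wrapping the lookup makes the failure mode explicit:
    try:
        return MODEL_URLS[name]
    except KeyError:
        raise KeyError(
            f"{name!r} is not in MODEL_URLS for this sat version; "
            f"download the checkpoint manually and pass its local path "
            f"to --from_pretrained"
        ) from None
```

If the installed sat version really has no `cogvlm-chat` entry, a common workaround is to download the checkpoint by hand and point `--from_pretrained` at the local directory, which bypasses the `MODEL_URLS` lookup entirely.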
Expected behavior / 期待表现
The `cogvlm-chat` checkpoint should be downloaded automatically and the demo should start.