I ran this on a MacBook Air (M2). When I run:
python tools/download_model.py --model VILA_7B_awq_int4_CLIP_ViT-L --QM QM_ARM
I get this:
Start downloading the model to ./LLaMA2_7B_chat_awq_int4.zip.
172KB [00:00, 267.55KB/s]
File downloaded successfully: ./LLaMA2_7B_chat_awq_int4.zip
The md5sum of the file does not match the expected md5sum. Expected: af20c96de302c503a9fcfd5877ed0600, got: fd4917f2c0669b14a56d5be6a370bc75
Then I run:
make chat -j
./chat
TinyChatEngine by MIT HAN Lab: https://github.com/mit-han-lab/TinyChatEngine
Using model: LLaMA2_7B_chat
Using AWQ for 4bit quantization: https://github.com/mit-han-lab/llm-awq
Loading model... No such file or directory: INT4/models/LLaMA_7B_2_chat/decoder/embed_tokens/weight.bin
libc++abi: terminating due to uncaught exception of type char const*
zsh: abort ./chat
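The checksum failure above can be reproduced outside the downloader. This is a minimal sketch, not the downloader's actual code; the archive name and expected hash are copied from the log above:

```python
import hashlib
import os

def md5sum(path, chunk_size=1 << 20):
    """Hex MD5 digest of a file, read in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

EXPECTED = "af20c96de302c503a9fcfd5877ed0600"  # from the log above
ARCHIVE = "./LLaMA2_7B_chat_awq_int4.zip"

if os.path.exists(ARCHIVE):
    actual = md5sum(ARCHIVE)
    if actual != EXPECTED:
        # A mismatch here usually means a truncated or interrupted download.
        print(f"Checksum mismatch: expected {EXPECTED}, got {actual}")
```

Note that the log shows only 172KB downloaded for a 7B model, so the archive is almost certainly truncated; deleting the zip and re-running the download script is the usual fix.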
File downloaded successfully: ./LLaMA2_7B_chat_awq_int4.zip
The md5sum of the file does not match the expected md5sum. Expected: af20c96de302c503a9fcfd5877ed0600, got: fd4917f2c0669b14a56d5be6a370bc75
Your downloaded zip was not extracted correctly; maybe try unzipping it manually?
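If the automatic extraction failed, manual extraction can be sketched with Python's standard `zipfile` module. The expected weight path is taken from the error message above; `extract_and_check` is a hypothetical helper name, and the archive name assumes the zip sits next to the `chat` binary:

```python
import os
import zipfile

def extract_and_check(archive, expected_file, dest="."):
    """Extract a model archive and report whether an expected weight file appeared."""
    try:
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(dest)
    except (FileNotFoundError, zipfile.BadZipFile):
        # A corrupt or truncated zip raises BadZipFile -- re-download first.
        print(f"{archive} is missing or corrupt; re-download it before extracting.")
        return False
    return os.path.exists(os.path.join(dest, expected_file))

# Path taken from the error message above; run this from the directory
# containing ./chat so the engine can find the extracted weights.
if os.path.exists("./LLaMA2_7B_chat_awq_int4.zip"):
    ok = extract_and_check(
        "./LLaMA2_7B_chat_awq_int4.zip",
        "INT4/models/LLaMA_7B_2_chat/decoder/embed_tokens/weight.bin",
    )
    print("weights in place" if ok else "weights still missing")
```

Extraction alone will not help here, though, if the checksum mismatch means the zip is truncated; verify the download first.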
Hello, I am facing the same problem with model LLaMA_3_8B_Instruct_awq_int4. Have you solved it? I would really appreciate any advice.