When I loaded the model with model = GenericLoraKbitModel('aleksickx/llama-7b-hf') in examples/features/int4_finetuning/LLaMA_lora_int4.ipynb, I got the following error:
RecursionError: maximum recursion depth exceeded while calling a Python object
According to the issue mentioned in huggingface/transformers#22762, it seems that the tokenizer of llama-7b-hf is not compatible with the latest version of transformers.
Could you provide the versions of transformers (and the other libraries) that can successfully run the code in your notebook?
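To make it easier to compare environments, here is a small sketch that prints the installed versions of the libraries the notebook likely depends on. The package list is an assumption on my part (the notebook may pull in others); it uses only the standard library, so it runs even when a package is missing.

```python
# Hedged sketch: report installed versions of likely dependencies so a
# maintainer can compare against a known-good environment.
# Assumption: the RecursionError comes from a transformers/tokenizer
# version mismatch, as suggested in huggingface/transformers#22762.
from importlib.metadata import version, PackageNotFoundError

def lib_version(name: str) -> str:
    """Return the installed version of `name`, or 'not installed'."""
    try:
        return version(name)
    except PackageNotFoundError:
        return "not installed"

# Assumed dependency list; adjust to match the notebook's actual imports.
for pkg in ("transformers", "tokenizers", "peft", "bitsandbytes"):
    print(f"{pkg}: {lib_version(pkg)}")
```

Pasting this output into the issue would let the maintainers pin the exact versions the notebook was tested with.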