
RecursionError: maximum recursion depth exceeded while calling a Python object #287

Open

ForAxel opened this issue Oct 16, 2024 · 0 comments

ForAxel commented Oct 16, 2024

My System Info

- Python 3.10.15
- torch 2.4.1
- transformers 4.31.0
- xturing 0.1.8
- sentencepiece 0.1.99

When I load the model with `model = GenericLoraKbitModel('aleksickx/llama-7b-hf')` in `examples/features/int4_finetuning/LLaMA_lora_int4.ipynb`, I get the following error:

RecursionError: maximum recursion depth exceeded while calling a Python object

According to the issue reported in huggingface/transformers#22762, it seems that the tokenizer of llama-7b-hf is not compatible with recent versions of transformers.

Could you provide the versions of transformers (and other libraries) that successfully run the code in this notebook?
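For context, Python raises this error whenever the interpreter's call-stack limit (`sys.getrecursionlimit()`, 1000 by default) is exceeded, whether by explicit recursion or, as suspected here, by objects repeatedly delegating calls to each other during tokenizer loading. A self-contained sketch, purely illustrative and unrelated to xturing itself:

```python
import sys

def recurse(n=0):
    # Each call adds a stack frame; once the depth passes
    # sys.getrecursionlimit(), Python raises RecursionError.
    return recurse(n + 1)

try:
    recurse()
except RecursionError as e:
    print(type(e).__name__)  # RecursionError
```

Raising the limit with `sys.setrecursionlimit()` is not a fix here: if the tokenizer's calls are mutually unbounded, a larger limit only delays the same crash, so a compatible transformers version is the real question.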
