AttributeError: module transformers.models.llama has no attribute LLaMATokenizer #1

corranmac opened this issue Mar 9, 2023 · 4 comments

@corranmac

Hi, I can load the model fine via `model = transformers.LLaMAForCausalLM.from_pretrained("/content/drive/MyDrive/llama-13b-hf/")`,
but I can't find `LLaMATokenizer`, so I'm receiving the error `AttributeError: module transformers.models.llama has no attribute LLaMATokenizer`.

@galatolofederico

galatolofederico commented Mar 9, 2023

You just need to `pip install sentencepiece`.

The error is silent because of this import guard in transformers:

    try:
        # LLaMATokenizer depends on the optional sentencepiece package,
        # so its import is guarded behind an availability check.
        if not is_sentencepiece_available():
            raise OptionalDependencyNotAvailable()
    except OptionalDependencyNotAvailable:
        # The missing dependency is swallowed here, so the tokenizer
        # silently never gets exported and the AttributeError surfaces later.
        pass
    else:
        from .tokenization_llama import LLaMATokenizer
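
If sentencepiece is installed but the error persists, a quick sanity check is to ask transformers itself whether it can see the package, using the same helper the guard above calls (a minimal sketch; run it in the exact environment you import transformers from):

    from transformers.utils import is_sentencepiece_available

    # Should print True once sentencepiece is installed in the same
    # environment that transformers is imported from.
    print(is_sentencepiece_available())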

@chlee29

chlee29 commented Mar 13, 2023

Hi, I installed sentencepiece, but I still get the same error.

@GamerUntouch

You need this version of transformers; I think a recent update messed something up:
https://github.com/mbehm/transformers

In whatever file you're working with, change `LLaMATokenizer` to `LlamaTokenizer`.
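
A minimal sketch of that rename, assuming a transformers build where the class is already spelled `LlamaTokenizer`, and reusing the checkpoint path from the original post:

    from transformers import LlamaTokenizer  # note the casing: "Llama", not "LLaMA"

    tokenizer = LlamaTokenizer.from_pretrained("/content/drive/MyDrive/llama-13b-hf/")

The model class was renamed the same way, so `LLaMAForCausalLM` becomes `LlamaForCausalLM`.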

@discoelysiumLW

For right now, the model should be loaded as `AutoModelForCausalLM`
and the tokenizer as `AutoTokenizer`.
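
A sketch of that approach, reusing the path from the original post; the Auto classes read the checkpoint's config and resolve the concrete model and tokenizer classes themselves, so the class-name casing no longer matters:

    from transformers import AutoModelForCausalLM, AutoTokenizer

    path = "/content/drive/MyDrive/llama-13b-hf/"  # local checkpoint from the original post
    tokenizer = AutoTokenizer.from_pretrained(path)
    model = AutoModelForCausalLM.from_pretrained(path)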
