ValueError: Tokenizer class T5Tokenizer does not exist or is not currently imported. #9250
Comments
Hey @nsankar, I cannot reproduce the above error concerning the tokenizer. The tokenizer loads correctly on my command line. @mrm8488, when I load the model via `model = AutoModelForSeq2SeqLM.from_pretrained("mrm8488/mT5-small-finetuned-tydiqa-for-xqa")` I get the following warning:
-> I think the weights uploaded here correspond to the "old" T5 version. It would be awesome if you could check the weights :-)
Thanks @patrickvonplaten. I will check it out ASAP.
It seems to happen with other models:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("moussaKam/mbarthez")
```

```
Traceback (most recent call last):
  File "/home/user/.local/share/virtualenvs/project/lib/python3.8/site-packages/IPython/core/interactiveshell.py", line 3418, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-4-028660e65504>", line 3, in <module>
    tokenizer = AutoTokenizer.from_pretrained("moussaKam/mbarthez")
  File "/home/user/.local/share/virtualenvs/project/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 359, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class BarthezTokenizer does not exist or is not currently imported.
```

And:
This issue has been automatically marked as stale and closed because it has not had recent activity. Thank you for your contributions. If you think this still needs to be addressed, please comment on this thread.
I had a similar problem. Seems that it happens when the `sentencepiece` dependency is missing.
This happens with DeBERTa models as well; it seems the error isn't very descriptive.
I think on current master a better error message is given when `sentencepiece` is not installed.
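As a quick check, the sketch below verifies whether `sentencepiece` is importable, assuming (as the comments above suggest) that the missing dependency is what triggers the unhelpful error:

```python
# Minimal check: the slow sentencepiece-based tokenizers (T5Tokenizer,
# BarthezTokenizer, ...) are only defined when sentencepiece is installed,
# so AutoTokenizer reports them as "does not exist or is not currently
# imported" when it is absent.
try:
    import sentencepiece  # noqa: F401
    print("sentencepiece is installed")
except ImportError:
    print("sentencepiece is missing; install it with `pip install sentencepiece`")
```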
It doesn't work for me, though. :-( `ValueError: Tokenizer class BloomTokenizerFast does not exist or is not currently imported.`
Well, the newest version of transformers works for me.
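If you hit the `BloomTokenizerFast` variant of this error, it is worth confirming which transformers release is actually installed; this is only a sketch of that check, and the assumption that an outdated install is the cause comes from the comment above:

```python
import transformers

# Bloom support only exists in relatively recent transformers releases, so an
# older install raises the same "does not exist" error. Print the version and
# upgrade with `pip install -U transformers` if it is out of date.
print(transformers.__version__)
```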
I'm getting the same error with transformers==4.26 when trying to load ernie-m-base. The exact same code worked two days ago with XLM-V, and I've made sure that sentencepiece is installed. Edit: Ah, I think the error currently comes up because ernie-m is on the hub but not yet merged into master for transformers #21349 (?)
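One way to confirm that explanation is to check whether the tokenizer class named in the error exists in the installed release at all. This is just a sketch; `ErnieMTokenizer` is my guess at the class name based on the error pattern, not something quoted in this thread:

```python
import transformers

# If the class the checkpoint points to has not shipped in the installed
# release yet, AutoTokenizer raises the "does not exist or is not currently
# imported" ValueError even though the model is available on the Hub.
print(hasattr(transformers, "ErnieMTokenizer"))
```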
Environment info
- transformers version: Latest transformers==4.2.0.dev0

Who can help
@mfuntowicz
Information
The following code, shown in the latest HF newsletter, seems to have issues when I tried it.
I get the tokenizer error under both the fast and slow conditions (use_fast=True/False tokenizer parameter) when I checked.
The problem arises when using:
To reproduce
Steps to reproduce the behavior:
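The steps are not spelled out above, but a minimal reproduction can be reconstructed from the traceback below; the model id and use_fast=False are taken directly from the failing frames:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Reconstructed from the traceback: loading the slow tokenizer is the call
# that raises "Tokenizer class T5Tokenizer does not exist or is not
# currently imported."
tokenizer = AutoTokenizer.from_pretrained(
    "mrm8488/mT5-small-finetuned-tydiqa-for-xqa", use_fast=False
)
model = AutoModelForSeq2SeqLM.from_pretrained(
    "mrm8488/mT5-small-finetuned-tydiqa-for-xqa"
)
```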
Error reported:
```
ValueError                                Traceback (most recent call last)
<ipython-input-...> in <module>()
     10 from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
     11
---> 12 tokenizer = AutoTokenizer.from_pretrained("mrm8488/mT5-small-finetuned-tydiqa-for-xqa", use_fast=False)
     13
     14 model = AutoModelForSeq2SeqLM.from_pretrained("mrm8488/mT5-small-finetuned-tydiqa-for-xqa")

/usr/local/lib/python3.6/dist-packages/transformers/models/auto/tokenization_auto.py in from_pretrained(cls, pretrained_model_name_or_path, *inputs, **kwargs)
    358         if tokenizer_class is None:
    359             raise ValueError(
--> 360                 "Tokenizer class {} does not exist or is not currently imported.".format(tokenizer_class_candidate)
    361             )
    362         return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)

ValueError: Tokenizer class T5Tokenizer does not exist or is not currently imported.
```