Weird warning when calling LlamaTokenizer.__call__ #29273

Closed
m-Just opened this issue Feb 24, 2024 · 0 comments · Fixed by #29346


m-Just commented Feb 24, 2024

When I run the following code:

```python
from transformers import LlamaTokenizer

model_path_or_name = "lmsys/vicuna-7b-v1.5"
tokenizer = LlamaTokenizer.from_pretrained(model_path_or_name)

prompt = "hi"
tokenizer(prompt, return_tensors="pt")
```

it produces a weird warning that says:

```
Keyword arguments {'add_special_tokens': False} not recognized.
```

I traced the warning to this line, which calls `PreTrainedTokenizer.tokenize` with the keyword argument `add_special_tokens`. It seems that the argument should be consumed by `self.prepare_for_tokenization` inside `PreTrainedTokenizer.tokenize`, but it is not, leaving it in the leftover kwargs and triggering the warning.
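For illustration, here is a minimal self-contained sketch of the mechanism described above (it does not use the actual `transformers` internals; the function names merely mirror the ones mentioned): `tokenize` warns about any keyword arguments that `prepare_for_tokenization` fails to consume, which is exactly what happens when `add_special_tokens` is forwarded but never popped.

```python
import warnings


def prepare_for_tokenization(text, **kwargs):
    # In the buggy path, this helper does not pop 'add_special_tokens',
    # so the argument comes back unchanged in the leftover kwargs.
    return text, kwargs


def tokenize(text, **kwargs):
    # Mirrors the pattern in PreTrainedTokenizer.tokenize: any kwargs
    # still present after prepare_for_tokenization are reported as
    # "not recognized".
    text, kwargs = prepare_for_tokenization(text, **kwargs)
    if kwargs:
        warnings.warn(f"Keyword arguments {kwargs} not recognized.")
    return text.split()


tokenize("hi", add_special_tokens=False)
# emits: UserWarning: Keyword arguments {'add_special_tokens': False} not recognized.
```

A fix along these lines would have `prepare_for_tokenization` pop the argument from `kwargs` before returning, so nothing is left over to warn about.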

Transformers version: 4.38.1 (4.37.2 does not have this issue).
