T5 Tokenizer does not load: AttributeError: add_special_tokens conflicts with the method add_special_tokens in T5Tokenizer
#36032
System Info

transformers version: 4.48.2

Who can help?
@ArthurZucker
Information

Tasks

An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
Running this script should reproduce the error; it fails with the AttributeError shown in the title.
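For context, the conflict named in the error can be illustrated with a self-contained sketch. This is an assumption-based stand-in, not the transformers source: the class name `TokenizerSketch` and its guard logic are hypothetical, mimicking how a key saved in tokenizer_config.json can collide with a method of the same name during tokenizer construction.

```python
# Hypothetical sketch (not the transformers source) of why a config key
# named like a method raises "X conflicts with the method X".

class TokenizerSketch:
    """Stand-in for a tokenizer class that guards its attribute names."""

    def __init__(self, **kwargs):
        for key, value in kwargs.items():
            # If a config key shadows an existing method, refuse to set it.
            if hasattr(self, key) and callable(getattr(self, key)):
                raise AttributeError(
                    f"{key} conflicts with the method {key} in {type(self).__name__}"
                )
            setattr(self, key, value)

    def add_special_tokens(self, tokens):
        """Method whose name can collide with a saved config key."""
        return tokens


# A tokenizer_config.json containing an "add_special_tokens" entry would be
# forwarded as a keyword argument and hit the guard above:
try:
    TokenizerSketch(add_special_tokens=True)
except AttributeError as e:
    print(e)  # add_special_tokens conflicts with the method add_special_tokens in TokenizerSketch
```

If this is indeed the mechanism, removing the offending key from the saved tokenizer_config.json (or regenerating it with a compatible transformers version) would be the workaround to try.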
Expected behavior
I expected the tokenizer to load. Two similar issues were raised before, but they did not solve my problem.