Tokenizer doesn't exist

#19
by vegarab - opened
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.3"
tokenizer = AutoTokenizer.from_pretrained(model_id)

The example provided in the README yields the following error:

OSError: Can't load tokenizer for 'mistralai/Mistral-7B-v0.3'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'mistralai/Mistral-7B-v0.3' is the correct path to a directory containing all relevant files for a LlamaTokenizerFast tokenizer.
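One thing worth ruling out first is the local-directory shadowing case the error message itself mentions: if a folder named after the model id exists relative to the working directory, transformers resolves the id there instead of on the Hub. A minimal check (reusing the model_id from the snippet above):

```python
import os

model_id = "mistralai/Mistral-7B-v0.3"

# If a local directory with this exact relative path exists, transformers
# will try to load the tokenizer from it rather than from the Hub, which
# can produce this OSError when the folder lacks the tokenizer files.
shadowed = os.path.isdir(model_id)
print("local directory shadows the Hub id:", shadowed)
```

If this prints True, rename or remove that directory and retry the original AutoTokenizer call.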

Can you please provide updated instructions for how to access the model's tokenizer?

Hi, I also encountered the same issue at runtime, but replacing AutoTokenizer with LlamaTokenizer worked for me:

from transformers import LlamaTokenizer

tokenizer = LlamaTokenizer.from_pretrained(model_id)

Note that LlamaTokenizer is the slow, sentencepiece-based tokenizer, so the sentencepiece package must be installed.
