LMFlavorGraph-tokenizer / tokenizer_config.json
SEO — Upload tokenizer files (bc8845f)
{
  "model": "wordlevel",
  "vocab_size": 8287,
  "unk_token": "[UNK]",
  "pad_token": "[PAD]",
  "mask_token": "[MASK]",
  "special_tokens": ["[PAD]", "[UNK]", "[MASK]"]
}
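A minimal sketch of parsing this config with Python's standard `json` module and sanity-checking that each named special token also appears in the `special_tokens` list. The `config_text` string simply reproduces the file contents above; the consistency check is an assumption about how the fields are meant to relate, not part of any official loader.

```python
import json

# Contents of tokenizer_config.json, reproduced verbatim from above.
config_text = (
    '{"model": "wordlevel", "vocab_size": 8287, "unk_token": "[UNK]", '
    '"pad_token": "[PAD]", "mask_token": "[MASK]", '
    '"special_tokens": ["[PAD]", "[UNK]", "[MASK]"]}'
)

config = json.loads(config_text)

# Assumed consistency rule: every *_token field should be listed
# in special_tokens as well.
named_tokens = {config["unk_token"], config["pad_token"], config["mask_token"]}
missing = named_tokens - set(config["special_tokens"])

print(config["model"], config["vocab_size"], sorted(missing))
```

Running this confirms the config is internally consistent: `missing` is empty, so all three named tokens are covered by `special_tokens`.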