SEO committed
Commit 6d5302c · 1 Parent(s): 935a594

Upload tokenizer files

Files changed (3):
  1. tokenizer.json +0 -0
  2. tokenizer_config.json +1 -0
  3. vocab.json +0 -0
tokenizer.json ADDED
(diff too large to render)
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"model": "wordlevel", "vocab_size": 8286, "unk_token": "[UNK]", "special_tokens": ["[PAD]", "[UNK]"]}
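The one-line config above declares a word-level tokenizer with a vocabulary of 8,286 entries and `[PAD]`/`[UNK]` special tokens. A minimal sketch of sanity-checking those fields before use (the JSON string is copied from the diff; a real consumer would instead load the committed `tokenizer.json` with a library such as Hugging Face `tokenizers`, which is not assumed here):

```python
import json

# The single line added to tokenizer_config.json in this commit.
config_line = (
    '{"model": "wordlevel", "vocab_size": 8286, '
    '"unk_token": "[UNK]", "special_tokens": ["[PAD]", "[UNK]"]}'
)

config = json.loads(config_line)

# Sanity-check the fields a downstream loader would rely on:
# the model type, the vocabulary size, and that the unknown token
# is itself registered as a special token.
assert config["model"] == "wordlevel"
assert config["vocab_size"] == 8286
assert config["unk_token"] in config["special_tokens"]

print(config["model"], config["vocab_size"])
```

Such a check is cheap insurance when a config file is uploaded separately from the vocabulary it describes, since a mismatched `vocab_size` or missing special token only surfaces later at encode time.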
vocab.json ADDED
(diff too large to render)