Dataset: ZurabDz/tokenized_large_corpus_v2 (hosted on Hugging Face)
Formats: parquet
Size: 10M - 100M
Libraries: Datasets, Dask, Croissant, +1 more
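The Libraries entry above indicates the dataset can be read with the Datasets library. Below is a minimal sketch, assuming the standard datasets API; the repository id and the train split listed further down this page are the only details taken from it, and the column schema is not documented here.

```python
from datasets import load_dataset

# Stream the train split so the full set of shards is not downloaded up front.
ds = load_dataset("ZurabDz/tokenized_large_corpus_v2", split="train", streaming=True)

# Peek at the first example; the schema is not documented on this page.
first = next(iter(ds))
print(first.keys())
```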
Branch: refs/convert/parquet
Path: tokenized_large_corpus_v2 / default / train
1 contributor · History: 3 commits
Latest commit: ba2b6f4 (verified, about 1 year ago) by parquet-converter: "Delete old duckdb index files"
All shards are marked Safe and stored with Git LFS.

File            Size     Last commit             Last modified
0000.parquet    179 MB   Update parquet files    over 1 year ago
0001.parquet    173 MB   Update parquet files    over 1 year ago
0002.parquet    165 MB   Update parquet files    over 1 year ago
0003.parquet    167 MB   Update parquet files    over 1 year ago
0004.parquet    174 MB   Update parquet files    over 1 year ago
0005.parquet    169 MB   Update parquet files    over 1 year ago
0006.parquet    169 MB   Update parquet files    over 1 year ago
0007.parquet    174 MB   Update parquet files    over 1 year ago
0008.parquet    178 MB   Update parquet files    over 1 year ago
0009.parquet    178 MB   Update parquet files    over 1 year ago
0010.parquet    180 MB   Update parquet files    over 1 year ago
0011.parquet    181 MB   Update parquet files    over 1 year ago
0012.parquet    173 MB   Update parquet files    over 1 year ago
0013.parquet    170 MB   Update parquet files    over 1 year ago
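Since the refs/convert/parquet branch exposes each shard as a plain parquet file, a single shard can be fetched and inspected without loading the whole split. Below is a minimal sketch, assuming the huggingface_hub and pandas packages are available; the repository id, branch, and shard path come from the listing above, and everything else is illustrative.

```python
import pandas as pd
from huggingface_hub import hf_hub_download

# Download one shard from the auto-converted parquet branch shown above.
local_path = hf_hub_download(
    repo_id="ZurabDz/tokenized_large_corpus_v2",
    repo_type="dataset",
    revision="refs/convert/parquet",
    filename="default/train/0000.parquet",  # first shard in the file listing
)

# Inspect the shard locally; column names are not documented on this page.
df = pd.read_parquet(local_path)
print(df.shape)
print(df.columns.tolist())
```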