Hugging Face dataset: Shahzebbb/bookcorpus_tokenized_split
Formats: parquet
Size: 1B - 10B
Libraries: Datasets, Dask, Croissant, +1
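For reference, a minimal sketch of loading this repo with the `datasets` library (one of the libraries listed above). The repo id comes from this page; the "train" and "validation" splits are inferred from the parquet shard names in the file listing below, and the column schema is not shown here, so it is inspected rather than assumed.

```python
# Minimal sketch, assuming the standard `datasets` workflow.
# The repo id is taken from this page; split names ("train"/"validation")
# are inferred from the parquet shard names, and the tokenized column
# schema is inspected rather than assumed.
from datasets import load_dataset

ds = load_dataset("Shahzebbb/bookcorpus_tokenized_split")
print(ds)                          # DatasetDict with the available splits
print(ds["train"].column_names)    # tokenized columns; not documented on this page

# To avoid downloading all ~3.3 GB of shards up front, stream instead:
streamed = load_dataset(
    "Shahzebbb/bookcorpus_tokenized_split", split="train", streaming=True
)
print(next(iter(streamed)))        # first example
```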
Files and versions: bookcorpus_tokenized_split / data (branch: main)
1 contributor, history: 1 commit
Shahzebbb: "Upload dataset" (b1cd1e3, verified, 9 months ago)
All files below are stored via Git LFS and were added in the single "Upload dataset" commit 9 months ago.

train-00000-of-00015.parquet        195 MB
train-00001-of-00015.parquet        193 MB
train-00002-of-00015.parquet        199 MB
train-00003-of-00015.parquet        199 MB
train-00004-of-00015.parquet        203 MB
train-00005-of-00015.parquet        198 MB
train-00006-of-00015.parquet        194 MB
train-00007-of-00015.parquet        198 MB
train-00008-of-00015.parquet        201 MB
train-00009-of-00015.parquet        196 MB
train-00010-of-00015.parquet        198 MB
train-00011-of-00015.parquet        203 MB
train-00012-of-00015.parquet        202 MB
train-00013-of-00015.parquet        193 MB
train-00014-of-00015.parquet        193 MB
validation-00000-of-00002.parquet   164 MB
validation-00001-of-00002.parquet   162 MB
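Since Dask is also listed among the supported libraries, the shards can be read directly as parquet without materializing the whole dataset locally. A minimal sketch, assuming `dask[dataframe]` and `huggingface_hub` are installed so that the `hf://` fsspec protocol is available; the glob pattern mirrors the repo layout shown above, and the column schema remains an unknown from this page.

```python
# Minimal sketch of reading the train shards lazily with Dask.
# Assumes `pip install "dask[dataframe]" huggingface_hub` so that the
# "hf://" fsspec protocol is registered; the path mirrors the repo
# layout above (data/train-XXXXX-of-00015.parquet).
import dask.dataframe as dd

train = dd.read_parquet(
    "hf://datasets/Shahzebbb/bookcorpus_tokenized_split/data/train-*.parquet"
)

print(train.columns)        # inspect the tokenized schema (not shown on this page)
print(train.npartitions)    # roughly one partition per parquet shard
```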