---
license: mit
task_categories:
- text-generation
language:
- en
pretty_name: fineweb-sample-100BT-tokenized-45b
size_categories:
- 10B<n<100B
---

[FineWeb sample-100BT](https://huggingface.co/datasets/HuggingFaceFW/fineweb) tokenized with the [GPT-2 tokenizer](https://huggingface.co/openai-community/gpt2). Each sequence is truncated to a maximum length of 1024 tokens.
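A minimal sketch of how such sequences could be produced, assuming straightforward per-document tokenization with `transformers` (the exact preprocessing pipeline, batching, and EOS handling are not specified here):

```python
import numpy as np
from transformers import AutoTokenizer

# Load the GPT-2 tokenizer used for this dataset.
tokenizer = AutoTokenizer.from_pretrained("openai-community/gpt2")

def tokenize_document(text: str, max_length: int = 1024) -> np.ndarray:
    # Encode one document and truncate to the 1024-token limit.
    ids = tokenizer(text, truncation=True, max_length=max_length)["input_ids"]
    # uint16 suffices: GPT-2's vocabulary has 50,257 tokens (< 65,536).
    return np.asarray(ids, dtype=np.uint16)

tokens = tokenize_document("An example FineWeb document.")
print(tokens.dtype, tokens.shape)  # uint16 (n,) with n <= 1024
```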

For efficiency, tokens are stored as uint16 NumPy arrays; GPT-2's vocabulary of 50,257 tokens fits comfortably within the uint16 range (0-65,535). Shards are stored in the WebDataset format.
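A hedged example of streaming the shards with the `webdataset` library. The shard URL pattern and the per-sample key (`.npy` here) are assumptions; check the repository's file listing for the actual names:

```python
import io
import numpy as np
import webdataset as wds

# Placeholder URL pattern; substitute the actual repository path and
# shard naming scheme from the "Files and versions" tab.
urls = "https://huggingface.co/datasets/<org>/<repo>/resolve/main/shard-{000000..000009}.tar"

def decode_tokens(sample):
    # Assumes each sample stores one uint16 token sequence as a .npy payload.
    return np.load(io.BytesIO(sample["npy"]))

dataset = wds.WebDataset(urls).map(decode_tokens)
for tokens in dataset:
    print(tokens.dtype, tokens.shape)  # uint16, up to (1024,)
    break
```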