---
datasets:
- cerebras/SlimPajama-627B
language:
- en
---

This is the trained 1.3-billion-parameter LLaMA-2-architecture model from the work [Multi-Agent Collaborative Data Selection for Efficient LLM Pretraining](https://arxiv.org/pdf/2410.08102).
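
A minimal loading sketch with `transformers`, assuming the checkpoint is stored on the Hugging Face Hub in the standard LLaMA format; the repo id below is a placeholder, so substitute this repository's actual model id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id — replace with this repository's actual model id.
model_id = "ORG/multi-agent-data-selection-1.3b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation from a sample prompt.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```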