---
datasets:
  - cerebras/SlimPajama-627B
language:
  - en
---

This is the trained 1.3B-parameter LLaMA-2-architecture model from the paper *Multi-Agent Collaborative Data Selection for Efficient LLM Pretraining*.
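
A minimal loading sketch using the Hugging Face `transformers` library. The repository id below is a placeholder, not confirmed by this card; substitute the actual repo hosting these weights.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id (assumption) -- replace with the actual
# Hugging Face repository for this checkpoint.
model_id = "beccabai/<model-repo>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Quick generation check with the 1.3B LLaMA-2-architecture model.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```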