---
datasets:
- cerebras/SlimPajama-627B
language:
- en
---
This is the trained 1.3 billion parameter LLaMA-2 architecture model for the work *Multi-Agent Collaborative Data Selection for Efficient LLM Pretraining*.
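
Since the model follows the standard LLaMA-2 architecture, it should load with the Hugging Face `transformers` causal-LM API. The sketch below is a minimal usage example, assuming a hypothetical repository id (`model_id` is a placeholder, not the actual Hub path) and default generation settings:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id; replace with the actual repository id of this model on the Hub.
model_id = "path/to/this-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Run a short greedy generation as a smoke test.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```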