Qwen2 Collection Qwen2 language models, including pretrained and instruction-tuned models in 5 sizes: 0.5B, 1.5B, 7B, 57B-A14B, and 72B. • 39 items • Updated Nov 28, 2024 • 354
Scaling Laws for Linear Complexity Language Models Paper • 2406.16690 • Published Jun 24, 2024 • 22
SSMs Collection A collection of Mamba-2-based research models with 8B parameters trained on 3.5T tokens for comparison with Transformers. • 5 items • Updated about 14 hours ago • 26
CO2: Efficient Distributed Training with Full Communication-Computation Overlap Paper • 2401.16265 • Published Jan 29, 2024 • 1
OpenNLPLab/TransNormerLLM3-15B-Intermediate-Checkpoints Text Generation • Updated Apr 7, 2024 • 21 • 15
Lightning Attention-2: A Free Lunch for Handling Unlimited Sequence Lengths in Large Language Models Paper • 2401.04658 • Published Jan 9, 2024 • 25