hunterhector committed · Commit b869cdf · Parent(s): bcdfc3c
Update README.md

README.md CHANGED
@@ -11,12 +11,16 @@ tags:
 
 # CrystalCoder
 
-CrystalCoder is a state-of-the-art 7B parameter language model, distinctively trained on the SlimPajama and StarCoder datasets.
+CrystalCoder is a state-of-the-art 7B parameter language model, distinctively trained on the SlimPajama and StarCoder datasets.
+This model excels at balancing natural language processing and coding capabilities.
+Despite being trained on a smaller dataset of 1.4 trillion tokens (compared to LLaMA 2's 2 trillion), CrystalCoder surpasses LLaMA 2 on some challenging English and coding tasks.
+It demonstrates superior performance on benchmarks such as MMLU, HumanEval, and MBPP.
+Compared with other similar work, CrystalCoder is well balanced between language and coding tasks.
 
 | Model | Trained Tokens | ARC | HellaSwag | MMLU (5-shot) | TruthfulQA | Language Avg. | HumanEval (pass@1) | MBPP (pass@1) | Coding Avg. | Avg. of Avg. |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | Mistral 7B | - | 59.98 | 83.31 | 64.16 | 42.15 | 63.40 | 29.12 | 38.78 | 33.95 | 48.68 |
-| **
+| **CrystalCoder 7B** | 1.4T | 47.01 | 71.97 | 48.78 | 35.91 | 50.92 | 28.38 | 36.38 | 32.38 | 41.65 |
 | CodeLlaMA 7B | 2.5T | 39.93 | 60.80 | 31.12 | 37.82 | 42.42 | 33.50 | 41.40 | 37.45 | 39.94 |
 | OpenLLaMA v2 7B | 1T | 43.60 | 72.20 | 41.29 | 35.54 | 48.18 | 15.32 | 12.69 | 28.01 | 38.10 |
 | LLaMA 2 7B | 2T | 53.07 | 77.74 | 43.80 | 38.98 | 53.39 | 13.05 | 20.09 | 16.57 | 34.98 |
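The aggregate columns in the table above appear to follow a simple scheme: Language Avg. is the mean of the four language benchmarks, Coding Avg. is the mean of the two coding benchmarks, and Avg. of Avg. is the mean of those two averages. A minimal sketch checking the newly added CrystalCoder row against that scheme (scores copied from the table):

```python
# Check the aggregate columns of the CrystalCoder 7B row,
# assuming each "Avg." column is a plain arithmetic mean.
language_scores = [47.01, 71.97, 48.78, 35.91]  # ARC, HellaSwag, MMLU, TruthfulQA
coding_scores = [28.38, 36.38]                  # HumanEval, MBPP

language_avg = sum(language_scores) / len(language_scores)
coding_avg = sum(coding_scores) / len(coding_scores)
avg_of_avg = (language_avg + coding_avg) / 2

print(round(language_avg, 2))  # 50.92
print(round(coding_avg, 2))    # 32.38
print(round(avg_of_avg, 2))    # 41.65
```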