Dataset dump from an llm-jp-eval benchmark run (llm_jp_eval v1.4.1, vLLM v0.6.3.post1). Schema, 67 columns (name: dtype, observed range or cardinality):

model_type: stringclasses (4 values)
model: stringlengths (13–62)
AVG: float64 (0.04–0.7)
CG: float64 (0–0.68)
EL: float64 (0–0.62)
FA: float64 (0–0.35)
HE: float64 (0–0.79)
MC: float64 (0–0.92)
MR: float64 (0–0.95)
MT: float64 (0.3–0.86)
NLI: float64 (0–0.82)
QA: float64 (0.01–0.77)
RC: float64 (0.04–0.93)
SUM: float64 (0–0.29)
aio_char_f1: float64 (0–0.9)
alt-e-to-j_bert_score_ja_f1: float64 (0.26–0.88)
alt-e-to-j_bleu_ja: float64 (0.32–16)
alt-e-to-j_comet_wmt22: float64 (0.29–0.92)
alt-j-to-e_bert_score_en_f1: float64 (0.37–0.96)
alt-j-to-e_bleu_en: float64 (0.02–20.1)
alt-j-to-e_comet_wmt22: float64 (0.3–0.89)
chabsa_set_f1: float64 (0–0.62)
commonsensemoralja_exact_match: float64 (0–0.94)
jamp_exact_match: float64 (0–0.74)
janli_exact_match: float64 (0–0.95)
jcommonsenseqa_exact_match: float64 (0–0.97)
jemhopqa_char_f1: float64 (0–0.71)
jmmlu_exact_match: float64 (0–0.76)
jnli_exact_match: float64 (0–0.9)
jsem_exact_match: float64 (0–0.81)
jsick_exact_match: float64 (0–0.87)
jsquad_char_f1: float64 (0.04–0.93)
jsts_pearson: float64 (-0.23–0.91)
jsts_spearman: float64 (-0.19–0.88)
kuci_exact_match: float64 (0–0.86)
mawps_exact_match: float64 (0–0.95)
mbpp_code_exec: float64 (0–0.68)
mbpp_pylint_check: float64 (0–0.99)
mmlu_en_exact_match: float64 (0–0.82)
niilc_char_f1: float64 (0.01–0.7)
wiki_coreference_set_f1: float64 (0–0.13)
wiki_dependency_set_f1: float64 (0–0.55)
wiki_ner_set_f1: float64 (0–0.17)
wiki_pas_set_f1: float64 (0–0.12)
wiki_reading_char_f1: float64 (0.02–0.91)
wikicorpus-e-to-j_bert_score_ja_f1: float64 (0.15–0.87)
wikicorpus-e-to-j_bleu_ja: float64 (0.17–18.3)
wikicorpus-e-to-j_comet_wmt22: float64 (0.3–0.87)
wikicorpus-j-to-e_bert_score_en_f1: float64 (0.26–0.92)
wikicorpus-j-to-e_bleu_en: float64 (0.03–13.8)
wikicorpus-j-to-e_comet_wmt22: float64 (0.28–0.78)
xlsum_ja_bert_score_ja_f1: float64 (0–0.79)
xlsum_ja_bleu_ja: float64 (0–10.2)
xlsum_ja_rouge1: float64 (0.05–52.8)
xlsum_ja_rouge2: float64 (0.01–29.2)
xlsum_ja_rouge2_scaling: float64 (0–0.29)
xlsum_ja_rougeLsum: float64 (0.05–44.9)
architecture: stringclasses (11 values)
precision: stringclasses (2 values)
license: stringclasses (11 values)
params: float64 (0.14–70.6)
likes: int64 (0–4.03k)
revision: stringclasses (1 value)
num_few_shot: int64 (0–4)
add_special_tokens: stringclasses (2 values)
llm_jp_eval_version: stringclasses (1 value)
vllm_version: stringclasses (1 value)
Records (CSV; 90 rows, one per model and num_few_shot setting):

model_type,model,AVG,CG,EL,FA,HE,MC,MR,MT,NLI,QA,RC,SUM,aio_char_f1,alt-e-to-j_bert_score_ja_f1,alt-e-to-j_bleu_ja,alt-e-to-j_comet_wmt22,alt-j-to-e_bert_score_en_f1,alt-j-to-e_bleu_en,alt-j-to-e_comet_wmt22,chabsa_set_f1,commonsensemoralja_exact_match,jamp_exact_match,janli_exact_match,jcommonsenseqa_exact_match,jemhopqa_char_f1,jmmlu_exact_match,jnli_exact_match,jsem_exact_match,jsick_exact_match,jsquad_char_f1,jsts_pearson,jsts_spearman,kuci_exact_match,mawps_exact_match,mbpp_code_exec,mbpp_pylint_check,mmlu_en_exact_match,niilc_char_f1,wiki_coreference_set_f1,wiki_dependency_set_f1,wiki_ner_set_f1,wiki_pas_set_f1,wiki_reading_char_f1,wikicorpus-e-to-j_bert_score_ja_f1,wikicorpus-e-to-j_bleu_ja,wikicorpus-e-to-j_comet_wmt22,wikicorpus-j-to-e_bert_score_en_f1,wikicorpus-j-to-e_bleu_en,wikicorpus-j-to-e_comet_wmt22,xlsum_ja_bert_score_ja_f1,xlsum_ja_bleu_ja,xlsum_ja_rouge1,xlsum_ja_rouge2,xlsum_ja_rouge2_scaling,xlsum_ja_rougeLsum,architecture,precision,license,params,likes,revision,num_few_shot,add_special_tokens,llm_jp_eval_version,vllm_version
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V6-70B,0.6225,0.0964,0.5783,0.2658,0.7727,0.886,0.936,0.8401,0.8032,0.7044,0.9085,0.0564,0.8308,0.8539,13.1706,0.8893,0.9587,18.6605,0.8894,0.5783,0.9076,0.704,0.9222,0.9446,0.661,0.7328,0.7342,0.8074,0.848,0.9085,0.8722,0.8504,0.8058,0.936,0.0964,0.1827,0.8126,0.6214,0.0344,0.3689,0.0177,0.0252,0.8829,0.8423,16.1434,0.8297,0.9081,12.3161,0.752,0.6416,2.9336,15.9098,5.6527,0.0564,13.6909,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V6-70B,0.3656,0.0964,0.2202,0.1608,0.0054,0.865,0.062,0.8101,0.7409,0.2699,0.7342,0.0564,0.4067,0.8458,12.8299,0.8743,0.9479,17.915,0.8607,0.2202,0.9093,0.6839,0.8125,0.9249,0.1984,0.0017,0.719,0.6168,0.8725,0.7342,0.4631,0.8171,0.7608,0.062,0.0964,0.1827,0.009,0.2044,0.0025,0.0046,0,0,0.7971,0.7971,12.7938,0.7737,0.8986,11.4304,0.7318,0.6416,2.9336,15.9098,5.6527,0.0564,13.6909,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V7-70B,0.6593,0.4779,0.5789,0.2861,0.7574,0.9062,0.946,0.774,0.7986,0.712,0.925,0.0903,0.8266,0.862,14.9443,0.8927,0.9204,19.2739,0.7848,0.5789,0.9223,0.6638,0.8917,0.9544,0.6658,0.7232,0.7839,0.803,0.8508,0.925,0.9014,0.8726,0.8419,0.946,0.4779,0.8394,0.7915,0.6436,0.1132,0.4228,0.115,0.0799,0.6994,0.7891,15.9759,0.7081,0.8908,12.5414,0.7104,0.681,3.745,20.7696,9.0219,0.0903,17.1776,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Superb-V7-70B,0.5296,0.4779,0.3245,0.1605,0.7061,0.8782,0.716,0.7239,0.7654,0.2549,0.7281,0.0903,0.2285,0.869,14.1359,0.9122,0.8573,18.1692,0.5904,0.3245,0.9143,0.7098,0.7569,0.9437,0.2498,0.6846,0.7145,0.7948,0.8508,0.7281,0.9061,0.872,0.7766,0.716,0.4779,0.8394,0.7276,0.2863,0.0009,0.0135,0.0088,0.0099,0.7693,0.8288,11.3951,0.8396,0.834,11.2014,0.5533,0.681,3.745,20.7696,9.0219,0.0903,17.1776,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,0,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,sthenno-com/miscii-14b-1225,0.558,0.5924,0.231,0.1505,0.5701,0.8561,0.826,0.8386,0.7456,0.3605,0.863,0.1044,0.4587,0.8501,10.8416,0.9028,0.952,16.0986,0.8808,0.231,0.894,0.6178,0.8236,0.933,0.3379,0.6733,0.7671,0.7033,0.8161,0.863,0.8901,0.8611,0.7414,0.826,0.5924,0.9458,0.4669,0.2851,0.0137,0.0096,0,0.0008,0.7286,0.8029,8.2491,0.8258,0.895,8.9185,0.7451,0.6977,2.9606,27.9443,10.4524,0.1044,24.4193,Qwen2ForCausalLM,bfloat16,apache-2.0,14.766,17,main,0,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,sthenno-com/miscii-14b-1225,0.6454,0.5924,0.567,0.2604,0.7409,0.8806,0.89,0.8478,0.7708,0.5391,0.9064,0.1044,0.5488,0.8637,12.9486,0.9086,0.9539,17.5864,0.8838,0.567,0.897,0.6293,0.8194,0.9526,0.5754,0.7136,0.8505,0.7797,0.7749,0.9064,0.8923,0.8664,0.7922,0.89,0.5924,0.9458,0.7682,0.4931,0.0933,0.3381,0,0.071,0.7994,0.8293,11.1729,0.8409,0.9046,10.8587,0.7578,0.6977,2.9606,27.9443,10.4524,0.1044,24.4193,Qwen2ForCausalLM,bfloat16,apache-2.0,14.766,17,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,rombodawg/Rombos-LLM-V2.5-Qwen-32b,0.5456,0.5321,0.104,0.1468,0.5748,0.874,0.794,0.8386,0.7642,0.3892,0.8869,0.0973,0.441,0.8486,11.2433,0.901,0.9512,15.6734,0.8802,0.104,0.9043,0.6494,0.7944,0.9303,0.2681,0.5645,0.8188,0.7961,0.7623,0.8869,0.8954,0.8763,0.7875,0.794,0.5321,0.761,0.5851,0.4585,0.0281,0.0068,0.0354,0.0048,0.6589,0.8002,8.7465,0.8266,0.8967,9.6053,0.7466,0.6931,2.7803,25.9887,9.7254,0.0973,22.6735,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,48,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,rombodawg/Rombos-LLM-V2.5-Qwen-32b,0.6558,0.5321,0.588,0.2738,0.7769,0.8969,0.944,0.8476,0.8105,0.541,0.9054,0.0973,0.5542,0.8643,13.2226,0.9077,0.9553,17.6746,0.8859,0.588,0.8985,0.6753,0.8417,0.958,0.5672,0.7535,0.8977,0.7797,0.8579,0.9054,0.8895,0.8772,0.8341,0.944,0.5321,0.761,0.8003,0.5015,0.054,0.3845,0,0.1132,0.8175,0.8291,11.0412,0.8387,0.9044,11.1234,0.7581,0.6931,2.7803,25.9887,9.7254,0.0973,22.6735,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,48,main,4,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,maldv/Qwentile2.5-32B-Instruct,0.5715,0.3233,0.234,0.1439,0.7444,0.8716,0.866,0.8454,0.7511,0.4806,0.9113,0.1146,0.4784,0.8528,12.0338,0.9053,0.9549,16.6195,0.8865,0.234,0.9033,0.6322,0.7597,0.9357,0.5139,0.7182,0.8426,0.803,0.7179,0.9113,0.8892,0.8675,0.7757,0.866,0.3233,0.4116,0.7705,0.4496,0.0025,0.0069,0.0088,0.0082,0.6929,0.8076,9.3966,0.8356,0.8995,9.9257,0.7543,0.706,3.1737,28.67,11.4479,0.1146,24.92,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,28,main,0,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,maldv/Qwentile2.5-32B-Instruct,0.6457,0.3233,0.5805,0.2917,0.7864,0.8992,0.946,0.8506,0.8059,0.5826,0.9213,0.1146,0.5838,0.8662,13.2438,0.91,0.9557,17.5366,0.8856,0.5805,0.9058,0.6897,0.8236,0.9607,0.6368,0.7639,0.8948,0.8093,0.8121,0.9213,0.903,0.8802,0.8312,0.946,0.3233,0.4116,0.8089,0.5272,0.0321,0.3916,0.1239,0.0975,0.8134,0.8399,12.9473,0.8439,0.9085,11.632,0.7631,0.706,3.1737,28.67,11.4479,0.1146,24.92,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,28,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,arcee-ai/Virtuoso-Small,0.5215,0.5241,0.231,0.1427,0.3419,0.8596,0.764,0.8301,0.7379,0.3427,0.8706,0.0923,0.427,0.8455,10.4153,0.8959,0.9468,16.487,0.8664,0.231,0.895,0.6034,0.8042,0.9401,0.3144,0.564,0.7502,0.7184,0.8135,0.8706,0.896,0.8691,0.7436,0.764,0.5241,0.8896,0.1198,0.2868,0.0246,0.0044,0,0.0024,0.6818,0.7962,8.436,0.8199,0.8924,9.3765,0.7383,0.6867,2.5992,24.1217,9.2262,0.0923,21.1957,Qwen2ForCausalLM,bfloat16,apache-2.0,14.77,45,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,arcee-ai/Virtuoso-Small,0.639,0.5241,0.569,0.2468,0.7373,0.8867,0.888,0.8482,0.7807,0.5483,0.9075,0.0923,0.5425,0.8634,12.4457,0.9067,0.9544,17.0927,0.885,0.569,0.9053,0.6494,0.8236,0.9571,0.5787,0.7077,0.8624,0.7753,0.7928,0.9075,0.8861,0.8707,0.7978,0.888,0.5241,0.8896,0.767,0.5238,0.0753,0.3179,0,0.063,0.7778,0.8277,11.244,0.8421,0.9042,10.7569,0.759,0.6867,2.5992,24.1217,9.2262,0.0923,21.1957,Qwen2ForCausalLM,bfloat16,apache-2.0,14.77,45,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,nbeerbower/Mistral-Nemo-Prism-12B-v2,0.4261,0.006,0.2622,0.1425,0.4994,0.6263,0.382,0.8333,0.6629,0.3403,0.8169,0.1149,0.4148,0.8448,10.7822,0.9011,0.9477,14.2615,0.8756,0.2622,0.5306,0.5086,0.7403,0.7954,0.2408,0.4648,0.5489,0.6806,0.8362,0.8169,0.8254,0.7816,0.5529,0.382,0.006,0.0382,0.534,0.3652,0.0025,0.0015,0.0111,0.0072,0.6901,0.7924,8.0262,0.8178,0.895,8.5782,0.7388,0.7082,2.4913,32.6038,11.4923,0.1149,27.7702,MistralForCausalLM,bfloat16,apache-2.0,12.248,3,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,nbeerbower/Mistral-Nemo-Prism-12B-v2,0.5444,0.006,0.4957,0.2553,0.5886,0.844,0.728,0.8441,0.7104,0.5038,0.8971,0.1149,0.5552,0.8595,11.9703,0.9068,0.9516,15.7131,0.8809,0.4957,0.886,0.5115,0.7681,0.9276,0.4677,0.5383,0.7839,0.7487,0.7398,0.8971,0.8749,0.8479,0.7184,0.728,0.006,0.0382,0.639,0.4885,0.0054,0.3391,0.0619,0.0782,0.792,0.8206,9.9457,0.8358,0.9014,9.3628,0.7532,0.7082,2.4913,32.6038,11.4923,0.1149,27.7702,MistralForCausalLM,bfloat16,apache-2.0,12.248,3,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V3-70B,0.6425,0.2088,0.5941,0.2889,0.7778,0.9027,0.944,0.8525,0.8038,0.6987,0.9132,0.0831,0.8086,0.8733,14.583,0.9114,0.9589,18.6433,0.8897,0.5941,0.9296,0.6782,0.9292,0.9535,0.6741,0.7405,0.7753,0.8037,0.833,0.9132,0.882,0.8542,0.825,0.944,0.2088,0.3173,0.8152,0.6133,0.0435,0.4254,0.0265,0.0583,0.8905,0.8562,17.6223,0.8523,0.9087,12.2918,0.7566,0.6785,3.0786,20.1409,8.3149,0.0831,17.9211,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V3-70B,0.356,0.2088,0.2404,0.1612,0.0171,0.858,0.006,0.8183,0.6584,0.3086,0.5562,0.0831,0.4861,0.8657,14.1464,0.898,0.9576,18.14,0.8879,0.2404,0.8695,0.569,0.7792,0.933,0.2291,0,0.4741,0.6515,0.8181,0.5562,0.8727,0.8406,0.7716,0.006,0.2088,0.3173,0.0342,0.2105,0.0036,0.0108,0,0.0008,0.791,0.7825,13.6192,0.7498,0.8992,11.6258,0.7375,0.6785,3.0786,20.1409,8.3149,0.0831,17.9211,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V12-70B,0.6667,0.5763,0.5685,0.314,0.7433,0.8917,0.938,0.8363,0.7836,0.6989,0.9239,0.059,0.8605,0.8615,13.9287,0.8991,0.9586,18.7964,0.8891,0.5685,0.894,0.6437,0.8764,0.9517,0.5769,0.7162,0.7523,0.7942,0.8514,0.9239,0.8992,0.8721,0.8294,0.938,0.5763,0.9819,0.7704,0.6592,0.0852,0.4464,0.0885,0.0648,0.8851,0.8303,15.8592,0.7897,0.9161,12.8808,0.7672,0.6351,3.3978,16.1526,5.8881,0.059,13.6912,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V12-70B,0.6064,0.5763,0.3432,0.1637,0.7121,0.872,0.866,0.8288,0.7754,0.5642,0.9101,0.059,0.7172,0.8491,12.7037,0.8824,0.9582,17.8773,0.8884,0.3432,0.9128,0.7098,0.7639,0.9392,0.4718,0.6851,0.765,0.8049,0.8334,0.9101,0.8922,0.8639,0.7641,0.866,0.5763,0.9819,0.739,0.5036,0.001,0.0002,0.0354,0.0045,0.7773,0.8165,12.4108,0.7876,0.9072,11.6113,0.7569,0.6351,3.3978,16.1526,5.8881,0.059,13.6912,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V4-70B,0.3836,0.5863,0.2707,0.1754,0.094,0.7212,0.026,0.8462,0.2801,0.4569,0.6349,0.1278,0.5354,0.8701,14.8228,0.8998,0.96,18.8176,0.8919,0.2707,0.4241,0.5718,0.0292,0.9446,0.4527,0.1124,0.3209,0.0417,0.437,0.6349,0.8969,0.8645,0.7949,0.026,0.5863,0.9357,0.0756,0.3826,0.0037,0.009,0.0147,0.0093,0.8401,0.8383,13.8494,0.8393,0.9037,11.664,0.7539,0.7123,4.1077,28.7565,12.7821,0.1278,25.4036,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V4-70B,0.6969,0.5863,0.6246,0.3345,0.7633,0.9178,0.954,0.8623,0.8123,0.7601,0.9228,0.1278,0.882,0.879,15.5348,0.917,0.961,19.626,0.8933,0.6246,0.9369,0.6839,0.8819,0.9598,0.7126,0.7343,0.857,0.8068,0.8319,0.9228,0.9014,0.8727,0.8568,0.954,0.5863,0.9357,0.7923,0.6856,0.1257,0.5124,0.0531,0.0725,0.9087,0.8684,18.3481,0.8676,0.9148,12.8181,0.7711,0.7123,4.1077,28.7565,12.7821,0.1278,25.4036,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Base-32B,0.5157,0.5663,0.2932,0.1335,0.0549,0.8426,0.676,0.8423,0.6642,0.4421,0.8718,0.2855,0.5525,0.8642,13.0457,0.9069,0.9547,17.5592,0.8849,0.2932,0.8785,0.5431,0.6639,0.9276,0.4123,0.0892,0.673,0.7759,0.6649,0.8718,0.8961,0.8723,0.7217,0.676,0.5663,0.8956,0.0205,0.3616,0.005,0,0.0177,0,0.6449,0.8179,9.7732,0.831,0.8997,10.1877,0.7465,0.7905,8.0023,52.7522,28.5161,0.2855,44.8456,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Base-32B,0.6663,0.5663,0.557,0.2527,0.7535,0.8908,0.922,0.8439,0.7522,0.5775,0.9276,0.2855,0.5935,0.8646,12.2397,0.9076,0.9544,16.8035,0.8828,0.557,0.9058,0.6638,0.7486,0.9526,0.621,0.7074,0.8184,0.7936,0.7368,0.9276,0.8947,0.874,0.8139,0.922,0.5663,0.8956,0.7996,0.518,0.0132,0.3395,0.0531,0.0769,0.7809,0.8355,13.5653,0.8344,0.9046,11.2487,0.7509,0.7905,8.0023,52.7522,28.5161,0.2855,44.8456,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V1-32B,0.6373,0.243,0.5798,0.2879,0.7822,0.895,0.942,0.8464,0.8025,0.5903,0.9201,0.1215,0.6002,0.8668,12.7065,0.9101,0.9556,17.5826,0.8852,0.5798,0.8955,0.6868,0.8208,0.9634,0.6232,0.7605,0.8948,0.8043,0.806,0.9201,0.8948,0.8785,0.8262,0.942,0.243,0.3735,0.8039,0.5474,0.0141,0.3824,0.1327,0.0941,0.8161,0.8317,13.312,0.8313,0.9058,11.31,0.7592,0.7012,3.916,28.4658,12.1545,0.1215,24.9555,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Advanced-V1-32B,0.4603,0.243,0.3213,0.1156,0.4984,0.8665,0.614,0.6156,0.6427,0.2565,0.7677,0.1215,0.2848,0.7524,12.28,0.6978,0.8411,16.2831,0.5481,0.3213,0.8948,0.6092,0.2986,0.9366,0.2821,0.4346,0.7905,0.7955,0.7197,0.7677,0.8976,0.8732,0.7681,0.614,0.243,0.3735,0.5622,0.2025,0.005,0.0029,0,0.0003,0.5697,0.7149,8.8323,0.6591,0.8256,9.2275,0.5575,0.7012,3.916,28.4658,12.1545,0.1215,24.9555,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V5-70B,0.6149,0.0863,0.5699,0.2616,0.7728,0.8824,0.934,0.8154,0.8013,0.6923,0.9044,0.044,0.8248,0.8344,12.4686,0.8334,0.9588,18.3956,0.8891,0.5699,0.8963,0.6983,0.9208,0.9455,0.6398,0.7334,0.7317,0.8056,0.85,0.9044,0.8699,0.8476,0.8054,0.934,0.0863,0.1807,0.8121,0.6124,0.0346,0.345,0.0177,0.0226,0.8882,0.8266,15.8059,0.7868,0.9082,12.2544,0.7523,0.6213,2.8907,13.3272,4.401,0.044,11.0807,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Avengers-V5-70B,0.3734,0.0863,0.2253,0.1609,0.0069,0.8675,0.12,0.7964,0.7627,0.293,0.7438,0.044,0.4499,0.8349,12.4544,0.8432,0.9514,17.8958,0.8705,0.2253,0.9196,0.6983,0.8111,0.924,0.2031,0.0025,0.7231,0.7071,0.874,0.7438,0.486,0.8185,0.7589,0.12,0.0863,0.1807,0.0112,0.2261,0.0029,0.0038,0,0,0.798,0.7882,12.6282,0.7388,0.8991,11.5296,0.7331,0.6213,2.8907,13.3272,4.401,0.044,11.0807,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V1-32B,0.5176,0.247,0.1557,0.1176,0.6723,0.8734,0.806,0.665,0.7554,0.4004,0.8986,0.1022,0.4216,0.759,11.1519,0.7158,0.8589,15.634,0.6042,0.1557,0.9036,0.6322,0.7708,0.9339,0.3548,0.6498,0.8233,0.7973,0.7534,0.8986,0.8944,0.8735,0.7829,0.806,0.247,0.3936,0.6948,0.4247,0.0231,0.0087,0,0.0056,0.5504,0.7323,8.6672,0.6918,0.8602,9.1539,0.6481,0.6857,2.8962,26.3302,10.2219,0.1022,22.8557,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V1-32B,0.6299,0.247,0.5794,0.2816,0.7825,0.8964,0.938,0.8176,0.8161,0.5602,0.9082,0.1022,0.5696,0.8335,13.1786,0.8529,0.9558,17.6265,0.8865,0.5794,0.8985,0.7098,0.8361,0.958,0.5943,0.7616,0.8965,0.7948,0.8435,0.9082,0.8956,0.8815,0.8326,0.938,0.247,0.3936,0.8034,0.5167,0.0495,0.359,0.0885,0.1157,0.7954,0.7876,11.8516,0.7667,0.9074,11.5104,0.7642,0.6857,2.8962,26.3302,10.2219,0.1022,22.8557,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V2-32B,0.5006,0.1747,0.155,0.1135,0.6194,0.8733,0.78,0.6548,0.7477,0.3892,0.8958,0.1032,0.4303,0.758,11.1964,0.712,0.8551,15.4535,0.596,0.155,0.9023,0.6351,0.7278,0.9339,0.3322,0.6154,0.8221,0.7999,0.7536,0.8958,0.8941,0.8736,0.7836,0.78,0.1747,0.2651,0.6234,0.4052,0.0148,0.007,0,0.0032,0.5424,0.7297,8.5653,0.6868,0.851,9.0423,0.6243,0.6852,2.9647,26.6586,10.3239,0.1032,23.1535,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V2-32B,0.6209,0.1747,0.5732,0.2788,0.7805,0.8959,0.938,0.8064,0.8166,0.555,0.9081,0.1032,0.5684,0.8225,13.1622,0.8321,0.9553,17.7008,0.8851,0.5732,0.896,0.7069,0.8375,0.9589,0.5918,0.7591,0.8965,0.7898,0.8522,0.9081,0.8929,0.8818,0.8329,0.938,0.1747,0.2651,0.8018,0.5046,0.0518,0.3492,0.0973,0.1143,0.7816,0.7743,11.6422,0.7447,0.9072,11.3013,0.7637,0.6852,2.9647,26.6586,10.3239,0.1032,23.1535,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V3-32B,0.5058,0.2912,0.1058,0.1206,0.5718,0.8745,0.788,0.6894,0.7641,0.3788,0.8848,0.0951,0.4338,0.764,10.9772,0.7325,0.8699,15.1981,0.6626,0.1058,0.9038,0.6494,0.7903,0.9312,0.2662,0.5592,0.8196,0.7986,0.7627,0.8848,0.8947,0.876,0.7887,0.788,0.2912,0.492,0.5844,0.4364,0.0281,0.0056,0,0.0044,0.5646,0.7335,8.5283,0.7006,0.8629,9.1587,0.6619,0.6875,2.6923,25.3725,9.5144,0.0951,22.17,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V3-32B,0.6324,0.2912,0.5839,0.2723,0.7754,0.8962,0.944,0.8432,0.8099,0.5398,0.9049,0.0951,0.5517,0.8641,13.4331,0.9075,0.9551,17.6896,0.8853,0.5839,0.898,0.6753,0.8403,0.9571,0.5672,0.7512,0.8965,0.7809,0.8563,0.9049,0.8906,0.8792,0.8335,0.944,0.2912,0.492,0.7996,0.5004,0.0552,0.3803,0,0.1165,0.8096,0.8235,11.0257,0.8296,0.9017,11.1475,0.7503,0.6875,2.6923,25.3725,9.5144,0.0951,22.17,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,0,main,4,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,tokyotech-llm/Llama-3.1-Swallow-70B-Instruct-v0.3,0.4889,0,0.4568,0.1735,0.5205,0.8181,0.736,0.7427,0.6764,0.4459,0.6945,0.1137,0.373,0.858,12.9928,0.911,0.8897,16.7576,0.6673,0.4568,0.7583,0.6034,0.7139,0.9473,0.4644,0.4583,0.546,0.7456,0.7733,0.6945,0.8905,0.8578,0.7488,0.736,0,0,0.5826,0.5003,0.0147,0.0119,0.0088,0.0136,0.8187,0.8068,10.0695,0.8231,0.8487,10.4706,0.5692,0.7024,3.5848,28.9326,11.3782,0.1137,25.3589,LlamaForCausalLM,bfloat16,llama3.1;gemma,70.554,6,main,0,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,tokyotech-llm/Llama-3.1-Swallow-70B-Instruct-v0.3,0.6208,0,0.5877,0.3265,0.7107,0.9023,0.94,0.859,0.7648,0.7126,0.911,0.1137,0.8592,0.8718,14.4087,0.9156,0.9589,18.902,0.8913,0.5877,0.9176,0.6494,0.7986,0.9643,0.6303,0.6851,0.825,0.7879,0.7629,0.911,0.9022,0.8677,0.8251,0.94,0,0,0.7362,0.6482,0.0774,0.4797,0.0619,0.1101,0.9034,0.8512,13.1098,0.8579,0.9142,12.2545,0.7713,0.7024,3.5848,28.9326,11.3782,0.1137,25.3589,LlamaForCausalLM,bfloat16,llama3.1;gemma,70.554,6,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B,0.4838,0.008,0.1455,0.12,0.6601,0.8714,0.82,0.6421,0.7427,0.3368,0.8703,0.1045,0.298,0.7524,10.7871,0.7002,0.8516,15.6005,0.5858,0.1455,0.9023,0.6207,0.7083,0.9339,0.3785,0.6007,0.8394,0.798,0.7471,0.8703,0.8907,0.8715,0.778,0.82,0.008,0.0161,0.7196,0.334,0.0058,0.0024,0,0.0064,0.5853,0.7242,8.8307,0.6758,0.8424,8.9859,0.6066,0.6872,3.1771,26.7645,10.4431,0.1045,23.3063,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V4-32B,0.6106,0.008,0.5759,0.2869,0.7833,0.8965,0.936,0.821,0.8244,0.5672,0.9135,0.1045,0.578,0.8614,13.1877,0.9021,0.9527,17.7238,0.877,0.5759,0.9026,0.7356,0.8542,0.9571,0.591,0.7628,0.8981,0.8005,0.8334,0.9135,0.888,0.8779,0.8298,0.936,0.008,0.0161,0.8037,0.5324,0.0455,0.3418,0.1504,0.105,0.7916,0.7888,11.769,0.7714,0.8964,11.1378,0.7332,0.6872,3.1771,26.7645,10.4431,0.1045,23.3063,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V7-70B,0.6332,0.002,0.6168,0.3476,0.7359,0.9189,0.946,0.8634,0.7843,0.7465,0.9123,0.0912,0.8974,0.8785,15.0597,0.9182,0.961,19.426,0.8941,0.6168,0.9354,0.6437,0.8208,0.9678,0.657,0.7148,0.8435,0.7986,0.8147,0.9123,0.9096,0.8812,0.8536,0.946,0.002,0.002,0.7571,0.6852,0.1256,0.5263,0.0708,0.1016,0.9136,0.8633,15.3884,0.8664,0.916,12.5452,0.7747,0.6801,3.8393,21.2279,9.1197,0.0912,18.9384,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V7-70B,0.304,0.002,0.2899,0.179,0.2127,0.578,0.006,0.848,0.2658,0.3961,0.4749,0.0912,0.503,0.8652,14.3057,0.9132,0.9591,18.5229,0.892,0.2899,0.0003,0.5575,0.0444,0.9544,0.3019,0.0805,0.3636,0,0.3635,0.4749,0.8939,0.865,0.7794,0.006,0.002,0.002,0.345,0.3833,0.0031,0.0265,0.0177,0.0201,0.8276,0.8112,10.5794,0.8266,0.9066,11.6273,0.7602,0.6801,3.8393,21.2279,9.1197,0.0912,18.9384,LlamaForCausalLM,bfloat16,apache-2.0,70.554,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V8-70B,0.3595,0.002,0.3125,0.19,0.2773,0.5809,0.114,0.8548,0.2638,0.5403,0.7077,0.1108,0.8049,0.8664,14.4057,0.9143,0.9591,18.4793,0.892,0.3125,0.0073,0.5575,0.0389,0.9553,0.3482,0.2149,0.3599,0,0.3629,0.7077,0.8951,0.8661,0.7802,0.114,0.002,0.006,0.3397,0.4678,0.0024,0.027,0.0177,0.0168,0.8859,0.8285,10.7182,0.8482,0.9084,11.6781,0.7645,0.6993,3.9237,25.8447,11.071,0.1108,22.8914,LlamaForCausalLM,bfloat16,apache-2.0,70.554,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Japanese-Pro-V8-70B,0.635,0.002,0.6174,0.3483,0.7358,0.919,0.944,0.8634,0.7854,0.7464,0.912,0.1108,0.8966,0.8787,15.0616,0.9182,0.961,19.5028,0.8941,0.6174,0.9344,0.6437,0.8278,0.9687,0.6559,0.7148,0.841,0.798,0.8165,0.912,0.9102,0.8824,0.8539,0.944,0.002,0.006,0.7568,0.6868,0.1236,0.534,0.0708,0.101,0.9123,0.8631,15.3199,0.8661,0.9162,12.5576,0.7751,0.6993,3.9237,25.8447,11.071,0.1108,22.8914,LlamaForCausalLM,bfloat16,apache-2.0,70.554,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V6-32B,0.5067,0.2972,0.1068,0.122,0.5735,0.8744,0.784,0.6893,0.7659,0.3782,0.8869,0.0954,0.4336,0.7633,11.0622,0.7294,0.8708,15.2435,0.6656,0.1068,0.9041,0.6523,0.7944,0.9321,0.2662,0.5609,0.8209,0.7992,0.7625,0.8869,0.8949,0.8764,0.7871,0.784,0.2972,0.498,0.5862,0.4348,0.0281,0.0064,0,0.0054,0.57,0.7339,8.5598,0.7009,0.863,9.219,0.6613,0.6873,2.8057,25.4218,9.5439,0.0954,22.1584,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V6-32B,0.6336,0.2972,0.5881,0.2718,0.7762,0.8968,0.942,0.8432,0.8117,0.5426,0.9042,0.0954,0.5559,0.8641,13.299,0.9073,0.955,17.5951,0.8852,0.5881,0.8985,0.6782,0.8431,0.958,0.5672,0.7518,0.8969,0.7822,0.8581,0.9042,0.8904,0.8778,0.8337,0.942,0.2972,0.498,0.8006,0.5047,0.0523,0.3853,0,0.1147,0.8066,0.8235,11.0258,0.8292,0.9018,11.0763,0.7512,0.6873,2.8057,25.4218,9.5439,0.0954,22.1584,Qwen2ForCausalLM,bfloat16,apache-2.0,32.76,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V5-32B,0.4011,0.002,0.1298,0.1194,0.477,0.8707,0.524,0.6263,0.5804,0.2547,0.7214,0.1058,0.2886,0.7502,10.8268,0.6958,0.846,15.5952,0.5711,0.1298,0.8958,0.6264,0.2167,0.9374,0.2616,0.4058,0.8007,0.6427,0.6154,0.7214,0.8917,0.8732,0.7789,0.524,0.002,0.006,0.5481,0.2139,0.0058,0.0019,0,0.0046,0.5848,0.7171,8.9202,0.6632,0.8294,8.8016,0.5751,0.6874,3.1338,27.0943,10.5826,0.1058,23.589,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Saxo/Linkbricks-Horizon-AI-Avengers-V5-32B,0.6104,0.002,0.5787,0.2878,0.7826,0.8968,0.934,0.8179,0.8234,0.5708,0.9149,0.1058,0.5785,0.8609,12.9052,0.9024,0.951,17.687,0.8721,0.5787,0.9023,0.7328,0.8542,0.958,0.5923,0.7628,0.8998,0.7992,0.8311,0.9149,0.887,0.8764,0.83,0.934,0.002,0.006,0.8024,0.5415,0.0511,0.3396,0.1504,0.1006,0.7972,0.7916,11.7773,0.7767,0.8914,11.0859,0.7204,0.6874,3.1338,27.0943,10.5826,0.1058,23.589,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,1,main,4,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,sometimesanotion/Qwen2.5-14B-Vimarckoso-v3,0.5593,0.6225,0.1956,0.1489,0.6996,0.8547,0.824,0.8306,0.7402,0.3182,0.8167,0.1018,0.3472,0.8382,8.9138,0.8897,0.951,15.4776,0.8808,0.1956,0.8955,0.6466,0.8139,0.933,0.2903,0.667,0.7465,0.6622,0.8317,0.8167,0.8881,0.8567,0.7357,0.824,0.6225,0.9578,0.7322,0.317,0.0215,0.0138,0.0177,0.0032,0.6881,0.7882,7.5397,0.8048,0.8953,9.2972,0.7471,0.6958,2.6661,26.5851,10.1962,0.1018,23.3302,Qwen2ForCausalLM,bfloat16,apache-2.0,14.766,10,main,0,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,sometimesanotion/Qwen2.5-14B-Vimarckoso-v3,0.6404,0.6225,0.5708,0.2674,0.7343,0.8771,0.89,0.8416,0.7573,0.4963,0.8856,0.1018,0.4873,0.8571,12.2369,0.9025,0.9517,16.3534,0.8806,0.5708,0.8945,0.5776,0.7986,0.95,0.5645,0.7057,0.8513,0.7563,0.8027,0.8856,0.8756,0.8574,0.7869,0.89,0.6225,0.9578,0.7629,0.4371,0.084,0.3804,0.0265,0.0729,0.7732,0.8176,9.7327,0.8274,0.9022,10.4444,0.7558,0.6958,2.6661,26.5851,10.1962,0.1018,23.3302,Qwen2ForCausalLM,bfloat16,apache-2.0,14.766,10,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Manual-Dataset-Creation-Project/Matsu-7B,0.3435,0.0261,0.1212,0.0373,0.2867,0.6752,0.44,0.701,0.3863,0.214,0.8118,0.0787,0.2816,0.7696,9.1813,0.8016,0.8612,8.5945,0.7333,0.1212,0.7508,0.204,0.4653,0.7069,0.0841,0.379,0.226,0.6995,0.3367,0.8118,0.8038,0.8061,0.568,0.44,0.0261,0.0422,0.1945,0.2764,0,0.0007,0,0,0.1858,0.7055,6.3279,0.6865,0.8025,3.6906,0.5824,0.6765,2.4066,19.8323,7.8745,0.0787,17.4875,Qwen2ForCausalLM,bfloat16,apache-2.0,7.616,2,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Manual-Dataset-Creation-Project/Matsu-7B,0.5238,0.0261,0.4153,0.1804,0.6571,0.8217,0.784,0.8043,0.6852,0.4208,0.8877,0.0787,0.3878,0.8481,10.8584,0.8907,0.9401,13.2931,0.8654,0.4153,0.8357,0.4971,0.7472,0.9124,0.4838,0.63,0.6286,0.7361,0.8169,0.8877,0.8603,0.8236,0.7169,0.784,0.0261,0.0422,0.6842,0.3909,0.014,0.2513,0.0354,0.057,0.5441,0.7784,7.8446,0.7727,0.8648,7.1517,0.6886,0.6765,2.4066,19.8323,7.8745,0.0787,17.4875,Qwen2ForCausalLM,bfloat16,apache-2.0,7.616,2,main,4,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,spow12/ChatWaifu_v1.4,0.4754,0.508,0.3171,0.1449,0.3521,0.6718,0.51,0.8338,0.6296,0.3408,0.8162,0.1051,0.4027,0.8408,10.7853,0.9015,0.9489,14.9988,0.878,0.3171,0.7715,0.4253,0.6833,0.7185,0.2317,0.3595,0.576,0.7121,0.7514,0.8162,0.8538,0.8131,0.5253,0.51,0.508,0.9217,0.3447,0.388,0.015,0.005,0.0088,0.0051,0.6907,0.7927,8.3019,0.8188,0.8931,9.0605,0.737,0.6975,2.5452,28.6236,10.5019,0.1051,24.7426,MistralForCausalLM,bfloat16,cc-by-nc-4.0,12.248,17,main,0,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,spow12/ChatWaifu_v1.4,0.5942,0.508,0.5225,0.2633,0.5926,0.8493,0.752,0.844,0.677,0.5166,0.9057,0.1051,0.5455,0.8588,11.816,0.9072,0.9512,16.2055,0.8805,0.5225,0.892,0.4971,0.75,0.9312,0.5148,0.5476,0.7383,0.767,0.6324,0.9057,0.8592,0.8323,0.7247,0.752,0.508,0.9217,0.6375,0.4894,0.0299,0.3734,0.0531,0.084,0.7762,0.8236,10.4603,0.835,0.9018,9.9501,0.7535,0.6975,2.5452,28.6236,10.5019,0.1051,24.7426,MistralForCausalLM,bfloat16,cc-by-nc-4.0,12.248,17,main,4,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,sometimesanotion/Lamarck-14B-v0.6,0.5635,0.6325,0.2087,0.1495,0.6961,0.8548,0.816,0.8361,0.7384,0.3267,0.8372,0.1024,0.3895,0.8448,9.9323,0.8972,0.9522,15.8286,0.8826,0.2087,0.8945,0.6322,0.8153,0.9321,0.2797,0.6687,0.7473,0.6648,0.8326,0.8372,0.8874,0.8561,0.7376,0.816,0.6325,0.9639,0.7235,0.3111,0.0234,0.0121,0.0088,0.001,0.702,0.7955,7.8447,0.8157,0.8964,9.3314,0.749,0.6958,2.6458,26.9372,10.2555,0.1024,23.5995,Qwen2ForCausalLM,bfloat16,apache-2.0,14.766,11,main,0,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,sometimesanotion/Lamarck-14B-v0.6,0.6466,0.6325,0.5781,0.2665,0.7344,0.8795,0.892,0.8456,0.7643,0.5215,0.8953,0.1024,0.5032,0.8619,12.5569,0.9069,0.9529,16.5154,0.8825,0.5781,0.898,0.5977,0.7986,0.9508,0.5904,0.704,0.8583,0.7689,0.7981,0.8953,0.8809,0.8587,0.7897,0.892,0.6325,0.9639,0.7648,0.4709,0.0791,0.376,0.0177,0.0775,0.7821,0.8235,10.4555,0.8351,0.9041,10.7219,0.7578,0.6958,2.6458,26.9372,10.2555,0.1024,23.5995,Qwen2ForCausalLM,bfloat16,apache-2.0,14.766,11,main,4,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,suayptalha/HomerCreativeAnvita-Mix-Qw7B,0.3759,0.0161,0.2308,0.0861,0.6237,0.5273,0.536,0.6276,0.4286,0.2719,0.7027,0.0841,0.2947,0.6942,7.0719,0.7269,0.8218,12.2063,0.5484,0.2308,0.0927,0.5345,0.0667,0.8338,0.2328,0.5863,0.742,0.0587,0.7412,0.7027,0.8812,0.8481,0.6555,0.536,0.0161,0.002,0.6612,0.2883,0,0.0216,0.0016,0,0.4071,0.673,6.7359,0.6785,0.8131,7.5441,0.5565,0.6801,2.2978,23.7702,8.4118,0.0841,19.6591,Qwen2ForCausalLM,bfloat16,apache-2.0,7.616,9,main,0,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,suayptalha/HomerCreativeAnvita-Mix-Qw7B,0.5377,0.0161,0.4782,0.1861,0.6659,0.8452,0.798,0.7725,0.739,0.4335,0.8966,0.0841,0.3899,0.8139,11.5906,0.8003,0.9503,15.7431,0.8789,0.4782,0.8632,0.6034,0.7458,0.9187,0.4803,0.6303,0.841,0.714,0.7909,0.8966,0.8674,0.8432,0.7537,0.798,0.0161,0.002,0.7015,0.4302,0.0292,0.3386,0.0531,0.077,0.4326,0.7675,8.7182,0.7087,0.8814,9.5983,0.702,0.6801,2.2978,23.7702,8.4118,0.0841,19.6591,Qwen2ForCausalLM,bfloat16,apache-2.0,7.616,9,main,4,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,gmonsoon/SahabatAI-Lion-9B-TIES-v1,0.44,0.0201,0.1419,0.0754,0.4923,0.7073,0.648,0.8061,0.6756,0.3222,0.8534,0.0976,0.2744,0.8402,11.0065,0.879,0.9474,15.1281,0.869,0.1419,0.7189,0.523,0.6611,0.8248,0.3609,0.5044,0.6738,0.7235,0.7966,0.8534,0.8482,0.8206,0.5782,0.648,0.0201,0.0341,0.4803,0.3314,0.005,0.0054,0,0,0.3664,0.7714,7.8703,0.7686,0.8855,9.4081,0.7077,0.6917,2.6712,24.9743,9.7479,0.0976,21.5186,Gemma2ForCausalLM,bfloat16,gemma,9.242,1,main,0,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,gmonsoon/SahabatAI-Lion-9B-TIES-v1,0.5504,0.0201,0.4889,0.2311,0.6297,0.8392,0.77,0.8405,0.7093,0.5207,0.9073,0.0976,0.5317,0.8649,12.4678,0.8992,0.9528,17.3388,0.8787,0.4889,0.8783,0.5632,0.7542,0.9026,0.5667,0.5803,0.7753,0.738,0.7159,0.9073,0.8537,0.8332,0.7367,0.77,0.0201,0.0341,0.679,0.4638,0.0203,0.2846,0.0619,0.0783,0.7101,0.8263,10.6118,0.8296,0.9077,11.5902,0.7544,0.6917,2.6712,24.9743,9.7479,0.0976,21.5186,Gemma2ForCausalLM,bfloat16,gemma,9.242,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,jpacifico/Chocolatine-3B-Instruct-DPO-Revised,0.4489,0.0542,0.4146,0.1739,0.547,0.6296,0.65,0.776,0.5634,0.2158,0.8302,0.0838,0.1751,0.81,8.0379,0.8204,0.9253,11.5538,0.8266,0.4146,0.7202,0.5287,0.6778,0.7319,0.2232,0.4225,0.4166,0.6818,0.5123,0.8302,0.8175,0.7876,0.4368,0.65,0.0542,0.1627,0.6715,0.249,0.0194,0.2741,0.0531,0.0188,0.5041,0.7629,6.7004,0.7523,0.8828,8.6659,0.7046,0.6918,2.4009,27.7212,8.3748,0.0838,23.1551,Phi3ForCausalLM,float16,mit,3.821,26,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,jpacifico/Chocolatine-3B-Instruct-DPO-Revised,0.3605,0.0542,0.0639,0.0785,0.5003,0.526,0.412,0.7707,0.5404,0.1567,0.779,0.0842,0.1372,0.7903,6.7533,0.8158,0.9237,10.3267,0.8275,0.0639,0.6485,0.4799,0.5736,0.5416,0.1498,0.3711,0.4951,0.7159,0.4376,0.779,0.7851,0.7626,0.3879,0.412,0.0542,0.1627,0.6295,0.183,0.0063,0,0,0,0.3864,0.7409,5.9775,0.7416,0.8762,7.8029,0.6978,0.6914,2.2941,27.6478,8.4123,0.0842,23.1443,Phi3ForCausalLM,float16,mit,3.821,26,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,fblgit/pancho-v1-qw25-3B-UNAMGS,0.317,0.4157,0,0.0662,0.4485,0.4566,0.464,0.6081,0.2547,0.1679,0.5266,0.0789,0.2092,0.7697,6.3386,0.791,0.7951,10.89,0.4764,0,0.0035,0.3822,0.3889,0.7936,0.0665,0.4663,0.1549,0.19,0.1573,0.5266,0.8597,0.8275,0.5726,0.464,0.4157,0.8092,0.4308,0.2279,0.0042,0.0006,0,0,0.3262,0.7145,4.7989,0.7056,0.7862,6.7565,0.4593,0.676,1.5908,21.7877,7.8934,0.0789,19.009,Qwen2ForCausalLM,bfloat16,other,3.397,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,fblgit/pancho-v1-qw25-3B-UNAMGS,0.5287,0.4157,0.4772,0.1542,0.5709,0.7762,0.692,0.7924,0.6657,0.3234,0.8688,0.0789,0.2797,0.8309,9.5317,0.8634,0.9259,13.7424,0.8209,0.4772,0.8016,0.4885,0.6153,0.8624,0.3735,0.5312,0.797,0.7052,0.7225,0.8688,0.8248,0.7822,0.6646,0.692,0.4157,0.8092,0.6105,0.3171,0.0365,0.1747,0.0354,0.0273,0.4972,0.7752,7.0795,0.7778,0.8821,8.8906,0.7075,0.676,1.5908,21.7877,7.8934,0.0789,19.009,Qwen2ForCausalLM,bfloat16,other,3.397,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,MaziyarPanahi/calme-3.2-instruct-3b,0.5203,0.3735,0.4585,0.1526,0.5827,0.7774,0.676,0.8052,0.663,0.2922,0.8714,0.0712,0.1789,0.822,9.4746,0.8467,0.9429,13.7067,0.8658,0.4585,0.8049,0.454,0.6139,0.8794,0.437,0.5453,0.7675,0.7153,0.7642,0.8714,0.8495,0.8122,0.6479,0.676,0.3735,0.7149,0.6201,0.2607,0.0331,0.1593,0.0708,0.0516,0.4482,0.7816,7.1901,0.7856,0.8874,8.9582,0.7227,0.6681,1.7136,19.283,7.1288,0.0712,16.8948,Qwen2ForCausalLM,bfloat16,other,3.086,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,MaziyarPanahi/calme-3.2-instruct-3b,0.2642,0.3735,0,0.0565,0.2504,0.4603,0.37,0.7198,0.2203,0.152,0.2321,0.0712,0.1275,0.7969,7.9224,0.8314,0.8682,11.8686,0.6798,0,0.001,0.2586,0.55,0.8079,0.1432,0.3132,0.092,0.0013,0.1997,0.2321,0.8294,0.8033,0.572,0.37,0.3735,0.7149,0.1877,0.1855,0.0027,0,0,0,0.2797,0.7439,5.597,0.751,0.8389,6.9303,0.617,0.6681,1.7136,19.283,7.1288,0.0712,16.8948,Qwen2ForCausalLM,bfloat16,other,3.086,1,main,0,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,nitky/EZO-QwQ-32B-Preview,0.6262,0.2209,0.5788,0.2779,0.7824,0.8962,0.936,0.8068,0.8215,0.5599,0.9069,0.1009,0.5717,0.8339,12.864,0.8522,0.9559,17.9405,0.8869,0.5788,0.8993,0.7155,0.85,0.9589,0.5989,0.7614,0.8956,0.7986,0.8476,0.9069,0.8969,0.8826,0.8305,0.936,0.2209,0.3333,0.8035,0.5091,0.0417,0.3366,0.1327,0.1067,0.7716,0.7618,11.3881,0.7236,0.9069,11.3889,0.7645,0.6846,3.0527,25.7613,10.081,0.1009,22.4677,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,4,main,4,False,v1.4.1,v0.6.3.post1
🤝 : base merges and moerges,nitky/EZO-QwQ-32B-Preview,0.5199,0.2209,0.1672,0.1214,0.7085,0.8741,0.798,0.6628,0.7612,0.4061,0.8975,0.1009,0.4156,0.7589,11.0695,0.7147,0.8587,15.5642,0.6057,0.1672,0.9038,0.6379,0.7694,0.9366,0.3747,0.6792,0.834,0.7992,0.7656,0.8975,0.897,0.8749,0.7819,0.798,0.2209,0.3333,0.7379,0.4281,0.0142,0.0098,0,0.0046,0.5786,0.7296,8.6853,0.6875,0.8576,9.132,0.6433,0.6846,3.0527,25.7613,10.081,0.1009,22.4677,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,4,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,rombodawg/Rombos-Coder-V2.5-Qwen-14b,0.5406,0,0.5221,0.2527,0.6267,0.7837,0.864,0.825,0.7111,0.3756,0.9063,0.0793,0.308,0.8491,11.5286,0.8849,0.9461,14.8937,0.8698,0.5221,0.8186,0.5431,0.7903,0.8722,0.4538,0.5767,0.7786,0.7077,0.7359,0.9063,0.8694,0.8325,0.6603,0.864,0,0,0.6767,0.3648,0.0565,0.3208,0.0708,0.1144,0.7012,0.7996,8.9882,0.8084,0.8974,10.1685,0.7368,0.6625,1.4772,21.6988,7.9327,0.0793,18.2787,Qwen2ForCausalLM,bfloat16,apache-2.0,14.77,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,rombodawg/Rombos-Coder-V2.5-Qwen-14b,0.2371,0,0.1837,0.11,0.1817,0.1237,0.146,0.6565,0.2595,0.1901,0.6771,0.0793,0.1174,0.7561,4.3099,0.7585,0.8808,8.6742,0.6569,0.1837,0.3685,0.2471,0.5,0.0027,0.2694,0.2759,0.1545,0.1679,0.2277,0.6771,0.5086,0.5101,0,0.146,0,0,0.0875,0.1836,0.0167,0.0002,0.0177,0.01,0.5057,0.6971,3.5073,0.6687,0.8383,6.0672,0.5419,0.6625,1.4772,21.6988,7.9327,0.0793,18.2787,Qwen2ForCausalLM,bfloat16,apache-2.0,14.77,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Qwen/Qwen2.5-Coder-14B-Instruct,0.2006,0,0.1894,0.1087,0.1164,0.0054,0.048,0.6545,0.2569,0.1798,0.5682,0.0791,0.1108,0.7538,4.1832,0.7578,0.879,8.6853,0.6535,0.1894,0.0145,0.25,0.5,0.0018,0.2602,0.1951,0.1491,0.1679,0.2176,0.5682,0.0077,0.0279,0,0.048,0,0,0.0376,0.1683,0.0167,0.0002,0.0177,0.0066,0.5023,0.6952,3.4913,0.6643,0.8387,6.0968,0.5426,0.6634,1.5093,21.754,7.9106,0.0791,18.3111,Qwen2ForCausalLM,bfloat16,apache-2.0,14.77,72,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Qwen/Qwen2.5-Coder-14B-Instruct,0.5408,0,0.5232,0.2527,0.628,0.7844,0.864,0.8251,0.7125,0.3735,0.9067,0.0791,0.3071,0.8496,11.4861,0.8862,0.9462,14.9986,0.8696,0.5232,0.8194,0.5489,0.7917,0.8731,0.4466,0.5795,0.781,0.7058,0.7351,0.9067,0.8714,0.8354,0.6606,0.864,0,0,0.6765,0.3667,0.0592,0.3227,0.0619,0.1146,0.705,0.7995,8.9909,0.8077,0.8973,10.1774,0.737,0.6634,1.5093,21.754,7.9106,0.0791,18.3111,Qwen2ForCausalLM,bfloat16,apache-2.0,14.77,72,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,rombodawg/Rombos-Coder-V2.5-Qwen-32b,0.4644,0.6807,0.2383,0.1525,0.5689,0.7849,0.334,0.8069,0.4276,0.2544,0.7858,0.0744,0.2671,0.8249,8.6177,0.8747,0.9436,13.31,0.8631,0.2383,0.8231,0.4598,0.7222,0.8749,0.2547,0.5083,0.4047,0.1957,0.3556,0.7858,0.8391,0.8259,0.6568,0.334,0.6807,0.749,0.6295,0.2414,0.0242,0.0103,0.0619,0.007,0.6591,0.7636,7.2659,0.7796,0.8868,8.8741,0.7102,0.6736,1.8657,21.7841,7.4367,0.0744,19.0279,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,9,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,rombodawg/Rombos-Coder-V2.5-Qwen-32b,0.6192,0.6807,0.545,0.2774,0.6749,0.8224,0.894,0.8349,0.7296,0.4186,0.8594,0.0744,0.3784,0.8531,11.6644,0.8947,0.9502,15.6897,0.8762,0.545,0.8427,0.6523,0.8708,0.9088,0.5112,0.6309,0.7482,0.685,0.6919,0.8594,0.8697,0.8441,0.7156,0.894,0.6807,0.749,0.7188,0.3662,0.0532,0.3661,0.1239,0.0741,0.7698,0.8104,9.7427,0.8238,0.8992,10.3307,0.7448,0.6736,1.8657,21.7841,7.4367,0.0744,19.0279,Qwen2ForCausalLM,bfloat16,apache-2.0,32.764,9,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Qwen/Qwen2.5-Coder-7B-Instruct,0.0755,0,0,0.0557,0.0014,0.0163,0.018,0.3856,0,0.0901,0.2556,0.0082,0.0903,0.6088,1.3857,0.4077,0.8065,1.9173,0.3815,0,0.0488,0,0,0,0.0594,0,0,0,0,0.2556,0.0114,0.0255,0,0.018,0,0.492,0.0028,0.1206,0,0,0,0,0.2787,0.6071,2.5098,0.4361,0.7762,0.8851,0.3173,0.5466,0.1922,2.5366,0.8143,0.0082,2.1379,Qwen2ForCausalLM,bfloat16,apache-2.0,7.616,385,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Qwen/Qwen2.5-Coder-7B-Instruct,0.4854,0,0.4651,0.1872,0.5538,0.7473,0.758,0.8082,0.6338,0.2875,0.8901,0.0082,0.252,0.8353,9.7347,0.871,0.94,13.1207,0.8594,0.4651,0.7831,0.5086,0.6625,0.8445,0.3098,0.5024,0.7707,0.7008,0.5265,0.8901,0.8159,0.7778,0.6144,0.758,0,0.492,0.6051,0.3009,0.0213,0.2461,0.0619,0.053,0.5534,0.7775,7.5379,0.7779,0.8903,9.4069,0.7243,0.5466,0.1922,2.5366,0.8143,0.0082,2.1379,Qwen2ForCausalLM,bfloat16,apache-2.0,7.616,385,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Qwen/Qwen2-VL-7B-Instruct,0.3617,0.2028,0.0021,0.0654,0.214,0.705,0.386,0.8004,0.586,0.1545,0.7855,0.0771,0.0916,0.8239,9.8156,0.8728,0.9444,14.2479,0.8686,0.0021,0.7996,0.3851,0.6639,0.7766,0.2241,0.0627,0.6109,0.6256,0.6444,0.7855,0.8223,0.7797,0.5388,0.386,0.2028,0.4016,0.3653,0.1476,0.005,0.0022,0,0,0.3199,0.7368,4.9804,0.7482,0.8826,8.1019,0.7119,0.6725,2.0749,20.2402,7.7266,0.0771,17.8381,Qwen2VLForConditionalGeneration,bfloat16,apache-2.0,8.291,1070,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Qwen/Qwen2-VL-7B-Instruct,0.5297,0.2028,0.4999,0.1964,0.6035,0.7693,0.714,0.8139,0.6787,0.3723,0.8987,0.0771,0.3426,0.8423,11.395,0.8881,0.9463,14.616,0.8709,0.4999,0.7838,0.4282,0.7306,0.8776,0.405,0.5628,0.7436,0.7247,0.7666,0.8987,0.8678,0.8319,0.6466,0.714,0.2028,0.4016,0.6442,0.3694,0.0157,0.3206,0.0354,0.0339,0.5766,0.7716,7.6222,0.774,0.8915,9.4153,0.7227,0.6725,2.0749,20.2402,7.7266,0.0771,17.8381,Qwen2VLForConditionalGeneration,bfloat16,apache-2.0,8.291,1070,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/tara-3.8B,0.335,0.002,0,0.0739,0.1247,0.5925,0.474,0.7141,0.569,0.2119,0.8082,0.114,0.2058,0.8132,7.533,0.842,0.7768,1.5069,0.7088,0,0.6844,0.4741,0.5819,0.6774,0.1888,0.1206,0.5563,0.7247,0.508,0.8082,0.8422,0.8093,0.4158,0.474,0.002,0.008,0.1289,0.2411,0.0033,0,0,0,0.3659,0.7441,5.5676,0.7339,0.7748,1.9909,0.5717,0.7067,2.3017,35.3608,11.3979,0.114,28.7865,LlamaForCausalLM,float16,mit,3.821,0,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/tara-3.8B,0.4501,0.002,0.3811,0.0973,0.5097,0.6718,0.596,0.7868,0.6325,0.2957,0.8638,0.114,0.2537,0.8274,8.6546,0.8654,0.9321,12.5338,0.8449,0.3811,0.7462,0.4914,0.5833,0.798,0.3619,0.4177,0.7128,0.7178,0.657,0.8638,0.8781,0.8408,0.471,0.596,0.002,0.008,0.6018,0.2714,0.0086,0.0692,0.0442,0.0262,0.3383,0.7552,6.4392,0.7439,0.8755,7.5868,0.693,0.7067,2.3017,35.3608,11.3979,0.114,28.7865,LlamaForCausalLM,float16,mit,3.821,0,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/Neptuno-Alpha,0.3279,0.2229,0,0.0779,0.5352,0.6623,0.404,0.6669,0.1827,0.1413,0.6516,0.062,0.1297,0.8086,7.6523,0.8375,0.8393,12.1741,0.5617,0,0.5912,0.2471,0.5347,0.8168,0.0992,0.4979,0.0703,0,0.0615,0.6516,0.8461,0.8034,0.579,0.404,0.2229,0.4859,0.5724,0.1952,0.002,0.0008,0,0,0.3867,0.7433,5.786,0.7396,0.8136,7.4628,0.5289,0.66,1.726,17.1427,6.2011,0.062,14.732,Qwen2ForCausalLM,float16,other,3.397,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/Neptuno-Alpha,0.5125,0.2229,0.4551,0.1674,0.5848,0.7664,0.69,0.814,0.6635,0.3369,0.8746,0.062,0.2841,0.8361,9.6472,0.8757,0.943,13.791,0.8641,0.4551,0.7953,0.4626,0.6153,0.8525,0.4348,0.5448,0.7662,0.7153,0.7583,0.8746,0.8587,0.8271,0.6513,0.69,0.2229,0.4859,0.6248,0.2917,0.0604,0.178,0.0796,0.0513,0.4678,0.7871,7.6238,0.7909,0.8901,8.9473,0.7256,0.66,1.726,17.1427,6.2011,0.062,14.732,Qwen2ForCausalLM,float16,other,3.397,1,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/ultiima-32B,0.6559,0.5462,0.5811,0.2748,0.7769,0.8969,0.942,0.8479,0.8092,0.5391,0.9045,0.0962,0.5525,0.8647,13.317,0.9081,0.9552,17.6588,0.8858,0.5811,0.8988,0.6695,0.8389,0.9589,0.5672,0.7532,0.8969,0.7809,0.86,0.9045,0.8903,0.8776,0.8332,0.942,0.5462,0.7791,0.8007,0.4976,0.053,0.376,0.0025,0.1181,0.8244,0.8294,11.124,0.8396,0.9045,11.1507,0.7581,0.6916,2.8404,25.68,9.6245,0.0962,22.3663,Qwen2ForCausalLM,float16,apache-2.0,32.764,4,main,4,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/ultiima-32B,0.5487,0.5462,0.1049,0.1479,0.5912,0.8742,0.798,0.838,0.7651,0.3872,0.8867,0.0962,0.4385,0.8483,11.371,0.9003,0.9509,15.7374,0.8789,0.1049,0.9033,0.6494,0.7944,0.933,0.2631,0.5792,0.8209,0.7961,0.7648,0.8867,0.896,0.8775,0.7863,0.798,0.5462,0.7791,0.6033,0.46,0.0281,0.0058,0.0354,0.0048,0.6651,0.8,8.6987,0.8262,0.8968,9.6136,0.7466,0.6916,2.8404,25.68,9.6245,0.0962,22.3663,Qwen2ForCausalLM,float16,apache-2.0,32.764,4,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/Tara-3.8B-v1.1,0.335,0.002,0,0.0739,0.1247,0.5925,0.474,0.7141,0.569,0.2119,0.8082,0.114,0.2058,0.8132,7.533,0.842,0.7768,1.5069,0.7088,0,0.6844,0.4741,0.5819,0.6774,0.1888,0.1206,0.5563,0.7247,0.508,0.8082,0.8422,0.8093,0.4158,0.474,0.002,0.008,0.1289,0.2411,0.0033,0,0,0,0.3659,0.7441,5.5676,0.7339,0.7748,1.9909,0.5717,0.7067,2.3017,35.3608,11.3979,0.114,28.7865,LlamaForCausalLM,float16,mit,3.821,1,main,0,False,v1.4.1,v0.6.3.post1
🔶 : fine-tuned,Sakalti/Tara-3.8B-v1.1,0.4501,0.002,0.3811,0.0973,0.5097,0.6718,0.596,0.7868,0.6325,0.2957,0.8638,0.114,0.2537,0.8274,8.6546,0.8654,0.9321,12.5338,0.8449,0.3811,0.7462,0.4914,0.5833,0.798,0.3619,0.4177,0.7128,0.7178,0.657,0.8638,0.8781,0.8408,0.471,0.596,0.002,0.008,0.6018,0.2714,0.0086,0.0692,0.0442,0.0262,0.3383,0.7552,6.4392,0.7439,0.8755,7.5868,0.693,0.7067,2.3017,35.3608,11.3979,0.114,28.7865,LlamaForCausalLM,float16,mit,3.821,1,main,4,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-2b-instruct,0.2659,0,0.0212,0.045,0.2132,0.417,0.08,0.7722,0.4586,0.0983,0.7179,0.1018,0.0621,0.8157,8.6356,0.851,0.9326,11.8224,0.8467,0.0212,0.7322,0.3477,0.5819,0.2565,0.129,0.0604,0.2707,0.6035,0.4889,0.7179,0.6878,0.667,0.2622,0.08,0,0,0.366,0.1037,0.001,0,0,0,0.224,0.7246,4.9547,0.7119,0.8691,7.2378,0.6792,0.7025,2.4115,32.3066,10.178,0.1018,26.484,GraniteForCausalLM,bfloat16,apache-2.0,2.534,26,main,0,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-2b-instruct,0.4346,0,0.4093,0.1059,0.4579,0.6383,0.578,0.7881,0.5769,0.2693,0.8546,0.1018,0.1871,0.8298,8.4194,0.8549,0.934,12.1519,0.8498,0.4093,0.7392,0.4971,0.6125,0.7024,0.4022,0.3988,0.6771,0.6395,0.4585,0.8546,0.5843,0.5794,0.4733,0.578,0,0,0.5169,0.2187,0.0232,0.1152,0.0708,0.0347,0.2858,0.7505,6.3875,0.7357,0.8783,7.7065,0.7119,0.7025,2.4115,32.3066,10.178,0.1018,26.484,GraniteForCausalLM,bfloat16,apache-2.0,2.534,26,main,4,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-8b-instruct,0.4897,0,0.4612,0.1882,0.5498,0.732,0.676,0.8088,0.6339,0.3347,0.8951,0.1068,0.2781,0.8501,11.1067,0.8799,0.949,16.6705,0.8742,0.4612,0.7916,0.5259,0.6472,0.8222,0.436,0.4694,0.615,0.7241,0.6572,0.8951,0.8521,0.8131,0.5822,0.676,0,0,0.6303,0.2901,0.04,0.2771,0.0885,0.0202,0.5152,0.7714,7.6307,0.7619,0.8834,8.6784,0.7194,0.6999,3.1486,29.8696,10.6817,0.1068,24.8493,GraniteForCausalLM,bfloat16,apache-2.0,8.171,124,main,4,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-8b-instruct,0.3784,0,0,0.0551,0.5075,0.6142,0.414,0.8058,0.6015,0.1897,0.8682,0.1068,0.1949,0.8342,10.6665,0.8818,0.9458,14.6001,0.8688,0,0.7883,0.4885,0.6181,0.6256,0.1537,0.4199,0.5464,0.7361,0.6182,0.8682,0.8125,0.781,0.4286,0.414,0,0,0.595,0.2205,0,0,0.0088,0,0.2667,0.7511,6.5064,0.7585,0.8807,8.6202,0.7141,0.6999,3.1486,29.8696,10.6817,0.1068,24.8493,GraniteForCausalLM,bfloat16,apache-2.0,8.171,124,main,0,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-3b-a800m-instruct,0.3829,0,0.3821,0.0823,0.399,0.5627,0.506,0.7458,0.4081,0.2185,0.8158,0.092,0.1611,0.8121,8.1343,0.8198,0.929,11.125,0.8365,0.3821,0.6796,0.3563,0.525,0.605,0.3164,0.3344,0.3657,0.4773,0.3162,0.8158,-0.1174,-0.1113,0.4035,0.506,0,0,0.4636,0.178,0.0057,0.0928,0.0354,0.0156,0.2618,0.7063,5.2502,0.6537,0.8672,6.1114,0.6733,0.6873,2.116,27.5629,9.19,0.092,23.1874,GraniteMoeForCausalLM,bfloat16,apache-2.0,3.299,18,main,4,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-3b-a800m-instruct,0.2283,0,0,0.05,0.0072,0.3941,0.068,0.7158,0.3812,0.1138,0.6895,0.092,0.0736,0.7702,5.9758,0.7628,0.9214,10.0347,0.82,0,0.6057,0.3621,0.5,0.3003,0.145,0.0119,0.1467,0.6692,0.2279,0.6895,-0.0299,-0.0678,0.2764,0.068,0,0,0.0025,0.1227,0,0,0,0,0.25,0.6819,3.8124,0.6342,0.8592,5.9782,0.6462,0.6873,2.116,27.5629,9.19,0.092,23.1874,GraniteMoeForCausalLM,bfloat16,apache-2.0,3.299,18,main,0,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-1b-a400m-instruct,0.2913,0,0.2725,0.063,0.2519,0.3147,0.274,0.6509,0.4263,0.2195,0.6678,0.0632,0.1313,0.7571,6.0828,0.7033,0.9127,10.2086,0.7842,0.2725,0.4755,0.3678,0.5,0.2225,0.357,0.2457,0.3443,0.6307,0.2888,0.6678,-0.0213,0.0271,0.246,0.274,0,0,0.2582,0.1704,0.0233,0.0431,0.0177,0.016,0.2147,0.6461,4.2288,0.5127,0.8527,6.8219,0.6035,0.6479,1.5459,20.8561,6.3175,0.0632,16.9118,GraniteMoeForCausalLM,bfloat16,apache-2.0,1.335,11,main,4,False,v1.4.1,v0.6.3.post1
⭕ : instruction-tuned,ibm-granite/granite-3.1-1b-a400m-instruct,0.1841,0,0,0.0408,0.0609,0.2691,0.026,0.6265,0.4928,0.0679,0.3781,0.0632,0.0598,0.7245,4.5414,0.6486,0.9049,9.1863,0.7721,0,0.4672,0.3305,0.5,0.0929,0.0583,0.0006,0.544,0.6711,0.4187,0.3781,-0.0267,0.0396,0.2473,0.026,0,0,0.1213,0.0856,0,0,0,0,0.2041,0.6367,3.4966,0.5133,0.8376,5.5536,0.5721,0.6479,1.5459,20.8561,6.3175,0.0632,16.9118,GraniteMoeForCausalLM,bfloat16,apache-2.0,1.335,11,main,0,False,v1.4.1,v0.6.3.post1
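To work with the table programmatically, a minimal sketch that loads it and ranks models by their 4-shot AVG score. The file name leaderboard.csv and the use of pandas are assumptions, not part of the original dump; any CSV reader would do:

```python
# Minimal sketch: load the records above (saved as leaderboard.csv) and list
# the top models by overall average score in the 4-shot setting.
import pandas as pd

df = pd.read_csv("leaderboard.csv")

# Each model appears twice, once per num_few_shot setting (0 and 4);
# keep only the 4-shot rows before ranking.
few_shot = df[df["num_few_shot"] == 4]

top = few_shot.sort_values("AVG", ascending=False).head(5)
print(top[["model", "AVG", "params", "architecture"]].to_string(index=False))
```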