Column schema (name · dtype · range / cardinality):

eval_name             stringlengths   12–111
Precision             stringclasses   3 values
Type                  stringclasses   7 values
T                     stringclasses   7 values
Weight type           stringclasses   2 values
Architecture          stringclasses   64 values
Model                 stringlengths   355–689
fullname              stringlengths   4–102
Model sha             stringlengths   0–40
Average ⬆️            float64         0.74–52.1
Hub License           stringclasses   27 values
Hub ❤️                int64           0–6.09k
#Params (B)           float64         -1–141
Available on the hub  bool            2 classes
MoE                   bool            2 classes
Flagged               bool            2 classes
Chat Template         bool            2 classes
CO₂ cost (kg)         float64         0.04–187
IFEval Raw            float64         0–0.9
IFEval                float64         0–90
BBH Raw               float64         0.22–0.83
BBH                   float64         0.25–76.7
MATH Lvl 5 Raw        float64         0–0.71
MATH Lvl 5            float64         0–71.5
GPQA Raw              float64         0.21–0.47
GPQA                  float64         0–29.4
MUSR Raw              float64         0.29–0.6
MUSR                  float64         0–38.7
MMLU-PRO Raw          float64         0.1–0.73
MMLU-PRO              float64         0–70
Merged                bool            2 classes
Official Providers    bool            2 classes
Upload To Hub Date    stringclasses   525 values
Submission Date       stringclasses   263 values
Generation            int64           0–10
Base Model            stringlengths   4–102
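Each record below is one flattened row of this schema. As a minimal sketch of working with parsed rows (the field subset and sample values are transcribed from records in this dump; the class and function names are illustrative, not part of the leaderboard's own tooling):

```python
from dataclasses import dataclass

@dataclass
class LeaderboardRow:
    """A few representative columns from the schema above."""
    fullname: str    # fullname
    precision: str   # Precision
    params_b: float  # #Params (B); -1 means unknown
    average: float   # Average ⬆️ (0-100 scale)
    on_hub: bool     # Available on the hub

# Sample values transcribed from records in this dump.
rows = [
    LeaderboardRow("google/gemma-2-27b-it", "bfloat16", 27.227, 36.174283, True),
    LeaderboardRow("google/gemma-2-9b-it", "bfloat16", 9.0, 32.07276, True),
    LeaderboardRow("google/flan-t5-base", "float16", 0.248, 6.415642, True),
    LeaderboardRow("godlikehhd/ifd_2500_qwen", "float16", 1.544, 16.206918, False),
]

def top_hub_models(rows, n=2):
    """Rank hub-available models by their leaderboard average."""
    hub = [r for r in rows if r.on_hub]
    return sorted(hub, key=lambda r: r.average, reverse=True)[:n]

best = top_hub_models(rows)  # the two highest-averaging hub models
```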
godlikehhd_ifd_2500_qwen_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/ifd_2500_qwen
d5311754b90d5628c5dea75d2edd9fed191139ef
16.206918
0
1.544
false
false
false
false
1.727397
0.336474
33.647389
0.42983
19.136234
0.098187
9.818731
0.295302
6.040268
0.361469
7.25026
0.292138
21.348626
false
false
2024-12-31
0
Removed
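Each benchmark appears twice per record: a Raw fraction and a displayed 0–100 score. For several benchmarks the displayed value matches rescaling the raw accuracy above a random-guess floor; a sketch checked against the record above (the baselines — 0.25 for GPQA, 0.1 for MMLU-PRO, 0 for MATH Lvl 5 — are inferred from these rows, not taken from leaderboard documentation, and BBH and MUSR mix subtasks with different choice counts, so a single baseline does not reproduce their displayed scores):

```python
def normalized_score(raw: float, baseline: float) -> float:
    """Map a raw accuracy in [0, 1] to the 0-100 leaderboard scale,
    with the random-guess baseline rescaled to 0."""
    return 100.0 * (raw - baseline) / (1.0 - baseline)

# Raw values from the godlikehhd/ifd_2500_qwen record above.
gpqa = normalized_score(0.295302, 0.25)      # record shows 6.040268
mmlu_pro = normalized_score(0.292138, 0.1)   # record shows 21.348626
math_lvl5 = normalized_score(0.098187, 0.0)  # record shows 9.818731
```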
godlikehhd_ifd_new_correct_all_sample_2500_qwen_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/ifd_new_correct_all_sample_2500_qwen
dcb4ee52aafbc24f691c2ffaeb04627d61df727e
15.324012
0
1.544
false
false
false
false
1.782292
0.337573
33.757319
0.401964
15.682841
0.095921
9.592145
0.290268
5.369128
0.356167
6.554167
0.288896
20.988475
false
false
2024-12-31
0
Removed
godlikehhd_ifd_new_correct_sample_2500_qwen_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/ifd_new_correct_sample_2500_qwen
8e9ed2c5bd0f5cdf2a6d3bddd28fdeab1a49c4c7
16.393783
0
1.544
false
false
false
false
1.83534
0.339746
33.974632
0.411031
16.877004
0.10423
10.422961
0.307886
7.718121
0.362677
7.901302
0.293218
21.468676
false
false
2024-12-31
0
Removed
godlikehhd_ifd_new_qwen_2500_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/ifd_new_qwen_2500
10d4c5b0deb3468fc606089ea026c07dc7674c2a
15.886984
0
1.544
false
false
false
false
2.001242
0.323959
32.395932
0.415982
17.63795
0.111782
11.178248
0.300336
6.711409
0.358958
6.169792
0.291057
21.228576
false
false
2024-12-31
0
Removed
godlikehhd_qwen-2.5-1.5b-cherry_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/qwen-2.5-1.5b-cherry
d9a8a826182d4703fbaaf749b93c4434b3a72c5e
14.683674
0
0.772
false
false
false
false
1.894311
0.289338
28.933785
0.403576
16.323589
0.101964
10.196375
0.300336
6.711409
0.345625
4.569792
0.292304
21.367095
false
false
2024-12-29
0
Removed
godlikehhd_qwen_2.5-1.5b-cherry_new_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/qwen_2.5-1.5b-cherry_new
e7d49cb02c1112c5ecbd5064e355dd57eb4bb481
15.370243
0
1.544
false
false
false
false
1.782674
0.312044
31.204426
0.414963
17.597085
0.096677
9.667674
0.297819
6.375839
0.349594
6.332552
0.289395
21.043883
false
false
2024-12-29
0
Removed
godlikehhd_qwen_full_data_alpaca_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/qwen_full_data_alpaca
8fb1e35250496473c58b24abda40eed086b30614
15.856125
0
1.544
false
false
false
false
1.64641
0.313618
31.361787
0.422921
18.614216
0.092145
9.214502
0.292785
5.704698
0.405156
9.677865
0.285073
20.563682
false
false
2024-12-29
0
Removed
godlikehhd_qwen_ins_ans_2500_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
godlikehhd/qwen_ins_ans_2500
8eb2e22119485500990775f74289a271ab0051f1
14.783348
0
1.544
false
false
false
true
3.385133
0.269804
26.980412
0.407395
17.695312
0.114048
11.404834
0.291946
5.592841
0.358865
6.92474
0.280918
20.10195
false
false
2024-12-29
0
Removed
google_codegemma-1.1-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
google/codegemma-1.1-2b
9d69e500da236427eab5867552ffc87108964f4d
7.133868
gemma
17
2.506
true
false
false
false
1.899766
0.229363
22.936254
0.335342
7.551225
0.01284
1.283988
0.265101
2.013423
0.387146
5.926563
0.127826
3.091755
false
true
2024-04-30
2024-08-12
0
google/codegemma-1.1-2b
google_flan-t5-base_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
google/flan-t5-base
7bcac572ce56db69c1ea7c8af255c5d7c9672fc2
6.415642
apache-2.0
839
0.248
true
false
false
false
0.313243
0.189071
18.907056
0.352598
11.337694
0.010574
1.057402
0.238255
0
0.367115
3.222656
0.135721
3.969046
false
true
2022-10-21
2024-08-14
0
google/flan-t5-base
google_flan-t5-large_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
google/flan-t5-large
0613663d0d48ea86ba8cb3d7a44f0f65dc596a2a
9.658123
apache-2.0
713
0.783
true
false
false
false
0.466983
0.220095
22.00949
0.415312
17.510018
0.01435
1.435045
0.250839
0.111857
0.408323
9.007031
0.170878
7.875296
false
true
2022-10-21
2024-08-14
0
google/flan-t5-large
google_flan-t5-small_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
google/flan-t5-small
0fc9ddf78a1e988dac52e2dac162b0ede4fd74ab
6.129662
apache-2.0
323
0.077
true
false
false
false
0.28626
0.152426
15.242556
0.32829
6.363112
0.007553
0.755287
0.260906
1.454139
0.412292
10.369792
0.123338
2.593085
false
true
2022-10-21
2024-06-27
0
google/flan-t5-small
google_flan-t5-xl_float16
float16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
google/flan-t5-xl
7d6315df2c2fb742f0f5b556879d730926ca9001
11.705073
apache-2.0
485
2.85
true
false
false
false
0.697859
0.223742
22.374189
0.453106
22.695056
0.007553
0.755287
0.252517
0.33557
0.418094
11.328385
0.214678
12.741947
false
true
2022-10-21
2024-08-07
0
google/flan-t5-xl
google_flan-t5-xl_bfloat16
bfloat16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
google/flan-t5-xl
7d6315df2c2fb742f0f5b556879d730926ca9001
11.587167
apache-2.0
485
2.85
true
false
false
false
0.285352
0.220694
22.069442
0.453722
22.837588
0.000755
0.075529
0.245805
0
0.422031
11.853906
0.214179
12.68654
false
true
2022-10-21
2024-08-07
0
google/flan-t5-xl
google_flan-t5-xxl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
google/flan-t5-xxl
ae7c9136adc7555eeccc78cdd960dfd60fb346ce
13.662077
apache-2.0
1,235
11.267
true
false
false
false
1.412954
0.220045
22.004504
0.506589
30.119256
0.010574
1.057402
0.270134
2.684564
0.42175
11.185417
0.234292
14.921321
false
true
2022-10-21
2024-09-06
0
google/flan-t5-xxl
google_flan-ul2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
T5ForConditionalGeneration
google/flan-ul2
452d74ce28ac4a7f211d6ba3ef0717027f7a8074
13.675999
apache-2.0
553
19.46
true
false
false
false
1.119933
0.239254
23.925407
0.505374
30.02029
0.009063
0.906344
0.287752
5.033557
0.384354
5.577604
0.249335
16.59279
false
true
2023-03-03
2024-08-07
0
google/flan-ul2
google_gemma-1.1-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
google/gemma-1.1-2b-it
bf4924f313df5166dee1467161e886e55f2eb4d4
8.053374
gemma
157
2.506
true
false
false
true
0.65843
0.306748
30.674832
0.318463
5.862827
0.018127
1.812689
0.269295
2.572707
0.339396
2.024479
0.148354
5.37271
false
true
2024-03-26
2024-06-12
0
google/gemma-1.1-2b-it
google_gemma-1.1-7b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
google/gemma-1.1-7b-it
16128b0aeb50762ea96430c0c06a37941bf9f274
17.693584
gemma
270
8.538
true
false
false
true
1.156598
0.503911
50.391073
0.39353
15.934209
0.049094
4.909366
0.293624
5.816555
0.423021
11.510938
0.258394
17.599365
false
true
2024-03-26
2024-06-12
0
google/gemma-1.1-7b-it
google_gemma-2-27b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
google/gemma-2-27b
938270f5272feb02779b55c2bb2fffdd0f53ff0c
23.926167
gemma
205
27.227
true
false
false
false
11.228499
0.247522
24.752213
0.564291
37.390737
0.166163
16.616314
0.350671
13.422819
0.439635
13.921094
0.437084
37.453827
false
true
2024-06-24
2024-08-24
0
google/gemma-2-27b
google_gemma-2-27b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
google/gemma-2-27b-it
f6c533e5eb013c7e31fc74ef042ac4f3fb5cf40b
36.174283
gemma
541
27.227
true
false
false
true
9.652422
0.797768
79.77677
0.645139
49.272842
0.238671
23.867069
0.375
16.666667
0.403302
9.11276
0.445146
38.349586
false
true
2024-06-24
2024-08-07
1
google/gemma-2-27b
google_gemma-2-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
InternLM2ForCausalLM
google/gemma-2-2b
4d05c88d00441bf62bf87dcfd29e204c05089f36
10.129463
gemma
528
2.614
true
false
false
true
1.518796
0.199312
19.931227
0.365597
11.755808
0.028701
2.870091
0.262584
1.677852
0.423177
11.430469
0.218002
13.111333
false
true
2024-07-16
2024-07-31
0
google/gemma-2-2b
google_gemma-2-2b_float16
float16
🟢 pretrained
🟢
Original
InternLM2ForCausalLM
google/gemma-2-2b
0738188b3055bc98daf0fe7211f0091357e5b979
10.359616
gemma
528
2.614
true
false
false
true
2.836515
0.20176
20.176022
0.370867
12.497306
0.030211
3.021148
0.262584
1.677852
0.421875
11.267708
0.221659
13.517657
false
true
2024-07-16
2024-07-31
0
google/gemma-2-2b
google_gemma-2-2b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
InternLM2ForCausalLM
google/gemma-2-2b-it
2b6ac3ff954ad896c115bbfa1b571cd93ea2c20f
17.046939
gemma
1,025
2.614
true
false
false
true
1.234743
0.566834
56.683378
0.419923
17.980793
0.000755
0.075529
0.274329
3.243848
0.392885
7.077344
0.254987
17.220745
false
true
2024-07-16
2024-07-31
1
google/gemma-2-2b
google_gemma-2-2b-jpn-it_float16
float16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
google/gemma-2-2b-jpn-it
6b046bbc091084a1ec89fe03e58871fde10868eb
17.115406
gemma
163
2.614
true
false
false
false
1.011437
0.507783
50.778268
0.422557
18.525626
0.034743
3.47432
0.285235
4.697987
0.396385
7.68151
0.257813
17.534722
false
true
2024-09-25
2024-10-11
2
google/gemma-2-2b
google_gemma-2-2b-jpn-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
google/gemma-2-2b-jpn-it
6b046bbc091084a1ec89fe03e58871fde10868eb
16.67863
gemma
163
2.614
true
false
false
true
1.7088
0.52884
52.884014
0.417844
17.848086
0.047583
4.758308
0.275168
3.355705
0.37276
4.928385
0.246676
16.297281
false
true
2024-09-25
2024-10-14
2
google/gemma-2-2b
google_gemma-2-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Gemma2ForCausalLM
google/gemma-2-9b
beb0c08e9eeb0548f3aca2ac870792825c357b7d
21.205287
gemma
653
9
true
false
false
false
8.61624
0.203983
20.398321
0.537737
34.096819
0.134441
13.444109
0.328859
10.514541
0.446115
14.297656
0.410322
34.480275
false
true
2024-06-24
2024-07-11
0
google/gemma-2-9b
google_gemma-2-9b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
google/gemma-2-9b-it
1937c70277fcc5f7fb0fc772fc5bc69378996e71
32.07276
gemma
690
9
true
false
false
true
7.544268
0.743563
74.356264
0.599034
42.13662
0.194864
19.486405
0.360738
14.765101
0.407271
9.742188
0.38755
31.949985
false
true
2024-06-24
2024-07-11
1
google/gemma-2-9b
google_gemma-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
google/gemma-2b
2ac59a5d7bf4e1425010f0d457dde7d146658953
7.32196
gemma
985
2.506
true
false
false
false
1.295989
0.203758
20.375825
0.336564
8.246263
0.030211
3.021148
0.255034
0.671141
0.397781
7.55599
0.136553
4.061392
false
true
2024-02-08
2024-06-12
0
google/gemma-2b
google_gemma-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
google/gemma-2b-it
de144fb2268dee1066f515465df532c05e699d48
7.485804
gemma
713
2.506
true
false
false
true
0.705901
0.26903
26.902951
0.315082
5.214303
0.020393
2.039275
0.278523
3.803132
0.334125
3.032292
0.135306
3.922872
false
true
2024-02-08
2024-06-12
0
google/gemma-2b-it
google_gemma-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GemmaForCausalLM
google/gemma-7b
a0eac5b80dba224e6ed79d306df50b1e92c2125d
15.442819
gemma
3,134
8.538
true
false
false
false
2.509828
0.265932
26.593217
0.436153
21.116099
0.074018
7.401813
0.286913
4.9217
0.40624
10.979948
0.294797
21.644134
false
true
2024-02-08
2024-06-08
0
google/gemma-7b
google_gemma-7b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
google/gemma-7b-it
18329f019fb74ca4b24f97371785268543d687d2
13.067087
gemma
1,158
8.538
true
false
false
true
1.199025
0.386832
38.683249
0.36459
11.940832
0.029456
2.945619
0.284396
4.58613
0.427427
12.528385
0.169465
7.718307
false
true
2024-02-13
2024-06-12
1
google/gemma-7b
google_mt5-base_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
google/mt5-base
2eb15465c5dd7f72a8f7984306ad05ebc3dd1e1f
3.71634
apache-2.0
220
0.39
true
false
false
false
0.40008
0.164516
16.451571
0.288316
1.298551
0.009063
0.906344
0.239094
0
0.367208
2.867708
0.106965
0.773862
false
true
2022-03-02
2024-09-06
0
google/mt5-base
google_mt5-small_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
google/mt5-small
73fb5dbe4756edadc8fbe8c769b0a109493acf7a
4.255928
apache-2.0
139
0.17
true
false
false
false
0.360987
0.17181
17.180969
0.276584
1.070971
0
0
0.24245
0
0.38575
5.91875
0.112284
1.364879
false
true
2022-03-02
2024-09-06
0
google/mt5-small
google_mt5-xl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MT5ForConditionalGeneration
google/mt5-xl
63fc6450d80515b48e026b69ef2fbbd426433e84
5.19142
apache-2.0
23
3.23
true
false
false
false
1.807534
0.195964
19.596449
0.304736
3.282462
0
0
0.264262
1.901566
0.379521
5.040104
0.111951
1.32794
false
true
2022-03-02
2024-09-06
0
google/mt5-xl
google_mt5-xxl_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
T5ForConditionalGeneration
google/mt5-xxl
e07c395916dfbc315d4e5e48b4a54a1e8821b5c0
5.103077
apache-2.0
68
11.9
true
false
false
false
4.563877
0.235757
23.575668
0.295934
2.504711
0
0
0.241611
0
0.368948
3.551823
0.108876
0.986259
false
true
2022-03-02
2024-09-06
0
google/mt5-xxl
google_recurrentgemma-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
RecurrentGemmaForCausalLM
google/recurrentgemma-2b
195f13c55b371fc721eda0662c00c64642c70e17
7.015127
gemma
91
2.683
true
false
false
false
5.535049
0.301703
30.170282
0.319736
4.820362
0.020393
2.039275
0.245805
0
0.344573
3.104948
0.117603
1.955895
false
true
2024-04-06
2024-06-13
0
google/recurrentgemma-2b
google_recurrentgemma-2b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
RecurrentGemmaForCausalLM
google/recurrentgemma-2b-it
150248167d171fbdf4b02e7d28a4b3d749e570f6
7.995905
gemma
110
2.683
true
false
false
true
3.059051
0.294933
29.4933
0.333
7.978764
0.019637
1.963746
0.253356
0.447427
0.334063
3.624479
0.140209
4.467716
false
true
2024-04-08
2024-06-12
0
google/recurrentgemma-2b-it
google_recurrentgemma-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
RecurrentGemmaForCausalLM
google/recurrentgemma-9b
7b0ed98fb889ba8bdfa7c690f08f2e57a7c48dae
13.709461
gemma
58
9
true
false
false
false
26.087377
0.311594
31.159435
0.395626
15.323369
0.066465
6.646526
0.285235
4.697987
0.38026
6.599219
0.260472
17.83023
false
true
2024-06-07
2024-07-04
0
google/recurrentgemma-9b
google_recurrentgemma-9b-it_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
RecurrentGemmaForCausalLM
google/recurrentgemma-9b-it
43e62f98c3d496a5469ef4b18c1b11e417d68d1d
19.218115
gemma
51
9
true
false
false
true
15.127758
0.501038
50.103836
0.436719
21.62158
0.066465
6.646526
0.270134
2.684564
0.437906
13.771615
0.284325
20.48057
false
true
2024-06-07
2024-07-05
0
google/recurrentgemma-9b-it
google_switch-base-8_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
SwitchTransformersForConditionalGeneration
google/switch-base-8
92fe2d22b024d9937146fe097ba3d3a7ba146e1b
3.29595
apache-2.0
16
0.62
true
false
false
false
0.293406
0.158521
15.85205
0.287631
1.702478
0
0
0.25
0
0.35174
1.133333
0.109791
1.08784
false
true
2022-10-24
2024-09-06
0
google/switch-base-8
google_umt5-base_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
UMT5ForConditionalGeneration
<a target="_blank" href="https://huggingface.co/google/umt5-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">google/umt5-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/google__umt5-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
google/umt5-base
0de9394d54f8975e71838d309de1cb496c894ab9
3.516575
apache-2.0
13
-1
true
false
false
false
1.336092
0.174632
17.46322
0.278773
0.813553
0.004532
0.453172
0.254195
0.559284
0.338219
0.94401
0.107796
0.866209
false
true
2023-07-02
2024-09-06
0
google/umt5-base
goulue5_merging_LLM_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/goulue5/merging_LLM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">goulue5/merging_LLM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/goulue5__merging_LLM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
goulue5/merging_LLM
587115b34d72ef957fee2d8348b3ade3ae06d4a8
16.7121
2
1.544
false
false
false
false
1.10293
0.32326
32.326006
0.42165
18.28283
0.096677
9.667674
0.291107
5.480984
0.433281
12.760156
0.295795
21.75495
false
false
2024-11-21
2024-11-22
0
goulue5/merging_LLM
gpt2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
6.39103
mit
2,628
0.137
true
false
false
false
0.323928
0.193417
19.34168
0.303639
2.714298
0.003021
0.302115
0.260067
1.342282
0.432417
12.985417
0.114943
1.660387
false
true
2022-03-02
2024-06-26
0
gpt2
gpt2_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/gpt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gpt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gpt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gpt2
607a30d783dfa663caf39e06633721c8d4cfcd7e
5.977737
mit
2,628
0.137
true
false
false
false
0.039245
0.083333
8.333333
0.308333
9.199755
0
0
0.233333
0
0.433333
18.333333
0.1
0
false
true
2022-03-02
2024-06-26
0
gpt2
gradientai_Llama-3-8B-Instruct-Gradient-1048k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gradientai/Llama-3-8B-Instruct-Gradient-1048k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gradientai/Llama-3-8B-Instruct-Gradient-1048k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gradientai__Llama-3-8B-Instruct-Gradient-1048k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gradientai/Llama-3-8B-Instruct-Gradient-1048k
8697fb25cb77c852311e03b4464b8467471d56a4
18.283334
llama3
682
8.03
true
false
false
true
1.774329
0.445559
44.555889
0.43459
21.010529
0.053625
5.362538
0.277685
3.691275
0.42975
13.51875
0.294049
21.561022
false
true
2024-04-29
2024-06-12
0
gradientai/Llama-3-8B-Instruct-Gradient-1048k
grimjim_DeepSauerHuatuoSkywork-R1-o1-Llama-3.1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/DeepSauerHuatuoSkywork-R1-o1-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/DeepSauerHuatuoSkywork-R1-o1-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__DeepSauerHuatuoSkywork-R1-o1-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/DeepSauerHuatuoSkywork-R1-o1-Llama-3.1-8B
e62f1fd16eebdfa3f3084a44a8a37176ecb4074f
26.94048
llama3.1
4
8.03
true
false
false
false
1.369264
0.479706
47.970607
0.52694
32.769235
0.222054
22.205438
0.338087
11.744966
0.440781
14.097656
0.395695
32.854979
true
false
2025-01-30
2025-01-30
1
grimjim/DeepSauerHuatuoSkywork-R1-o1-Llama-3.1-8B (Merge)
grimjim_Gigantes-v1-gemma2-9b-it_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Gigantes-v1-gemma2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Gigantes-v1-gemma2-9b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Gigantes-v1-gemma2-9b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Gigantes-v1-gemma2-9b-it
f912b7cf7f07a593d0a4262f9f20a7adb0a93f9d
33.237428
gemma
2
9.242
true
false
false
false
2.995649
0.692455
69.245491
0.597793
42.797877
0.214502
21.450151
0.353188
13.758389
0.455479
16.334896
0.42254
35.837766
true
false
2024-12-28
2024-12-28
1
grimjim/Gigantes-v1-gemma2-9b-it (Merge)
grimjim_Gigantes-v2-gemma2-9b-it_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Gigantes-v2-gemma2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Gigantes-v2-gemma2-9b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Gigantes-v2-gemma2-9b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Gigantes-v2-gemma2-9b-it
5c410fbc679de69de48b25d18bb7e374f4a3471f
33.876786
gemma
1
9.242
true
false
false
false
3.037732
0.73507
73.506962
0.598656
42.701633
0.201662
20.166163
0.35151
13.534676
0.459479
17.134896
0.425947
36.216386
true
false
2024-12-29
2024-12-29
1
grimjim/Gigantes-v2-gemma2-9b-it (Merge)
grimjim_Gigantes-v3-gemma2-9b-it_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Gigantes-v3-gemma2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Gigantes-v3-gemma2-9b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Gigantes-v3-gemma2-9b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Gigantes-v3-gemma2-9b-it
dcd1a7dd037a17f07819da0c70b96e987678206c
33.490437
gemma
0
9.242
true
false
false
false
3.035864
0.697626
69.762563
0.598351
42.795368
0.20997
20.996979
0.356544
14.205817
0.460813
17.334896
0.422623
35.847001
true
false
2024-12-29
2024-12-29
1
grimjim/Gigantes-v3-gemma2-9b-it (Merge)
grimjim_HuatuoSkywork-o1-Llama-3.1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/HuatuoSkywork-o1-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/HuatuoSkywork-o1-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__HuatuoSkywork-o1-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/HuatuoSkywork-o1-Llama-3.1-8B
85bfa74ab2642914a532eb52a1720da5bc2afb00
24.477999
llama3.1
0
8.03
true
false
false
false
1.397147
0.39615
39.614999
0.488636
28.332778
0.388218
38.821752
0.292785
5.704698
0.383854
11.115104
0.309508
23.278664
true
false
2025-01-02
2025-01-03
1
grimjim/HuatuoSkywork-o1-Llama-3.1-8B (Merge)
grimjim_Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge
7a8d334dce0a2ce948f75612b8d3a61c53d094aa
20.836684
llama3
3
8.03
true
false
false
false
1.095097
0.427124
42.712447
0.496169
28.258015
0.099698
9.969789
0.290268
5.369128
0.404323
9.540365
0.362533
29.170361
true
false
2024-06-28
2024-06-29
1
grimjim/Llama-3-Instruct-8B-SPPO-Iter3-SimPO-merge (Merge)
grimjim_Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge
8f4d460ea20e24e48914156af7def305c0cd347f
24.040942
llama3
3
8
true
false
false
true
1.233883
0.68059
68.058972
0.502173
29.073286
0.089124
8.912387
0.262584
1.677852
0.38851
6.697135
0.368434
29.82602
true
false
2024-06-28
2024-09-17
1
grimjim/Llama-3-Instruct-8B-SimPO-SPPO-Iter3-merge (Merge)
grimjim_Llama-3.1-8B-Instruct-abliterated_via_adapter_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3.1-8B-Instruct-abliterated_via_adapter-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter
b37ab2f859c96b125ff1c45c7ff0e267aa229156
23.217302
llama3.1
29
8.03
true
false
false
false
1.803829
0.48695
48.695018
0.510527
29.41599
0.139728
13.97281
0.313758
8.501119
0.401031
9.26224
0.36511
29.456634
true
false
2024-07-25
2024-09-17
1
grimjim/Llama-3.1-8B-Instruct-abliterated_via_adapter (Merge)
grimjim_Llama-3.1-Bonsaikraft-8B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama-3.1-Bonsaikraft-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama-3.1-Bonsaikraft-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-3.1-Bonsaikraft-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama-3.1-Bonsaikraft-8B-Instruct
3291a835eb6591c97484502476ebbde38380849b
22.85664
llama3.1
1
8.03
true
false
false
false
1.356011
0.425001
42.500122
0.528686
32.587543
0.13142
13.141994
0.303691
7.158837
0.42351
11.038802
0.376413
30.712544
true
false
2024-11-30
2025-01-25
1
grimjim/Llama-3.1-Bonsaikraft-8B-Instruct (Merge)
grimjim_Llama-Nephilim-Metamorphosis-v2-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama-Nephilim-Metamorphosis-v2-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama-Nephilim-Metamorphosis-v2-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama-Nephilim-Metamorphosis-v2-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama-Nephilim-Metamorphosis-v2-8B
bef087901c4163991ec649608c7c07c759e4fe7c
23.002213
llama3.1
2
8.03
true
false
false
false
1.345998
0.454452
45.445197
0.501348
28.182462
0.139728
13.97281
0.322987
9.731544
0.409094
9.470052
0.380901
31.211215
true
false
2024-10-14
2025-01-02
1
grimjim/Llama-Nephilim-Metamorphosis-v2-8B (Merge)
grimjim_Llama3.1-SuperNovaLite-HuatuoSkywork-o1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Llama3.1-SuperNovaLite-HuatuoSkywork-o1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Llama3.1-SuperNovaLite-HuatuoSkywork-o1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Llama3.1-SuperNovaLite-HuatuoSkywork-o1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Llama3.1-SuperNovaLite-HuatuoSkywork-o1-8B
d4012c41f98f7f1fd33104a48d49319b382e8554
25.961665
llama3.1
1
8.03
true
false
false
false
1.343999
0.436592
43.659158
0.528719
32.952973
0.300604
30.060423
0.311242
8.165548
0.399854
11.115104
0.368351
29.816785
true
false
2025-01-07
2025-01-07
1
grimjim/Llama3.1-SuperNovaLite-HuatuoSkywork-o1-8B (Merge)
grimjim_Magnolia-v1-Gemma2-8k-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magnolia-v1-Gemma2-8k-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magnolia-v1-Gemma2-8k-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magnolia-v1-Gemma2-8k-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magnolia-v1-Gemma2-8k-9B
4a25a9e75e598bf249d92d84ed2cf5c707fa502e
25.512848
gemma
1
9.242
true
false
false
false
4.074025
0.353085
35.308537
0.558903
36.790012
0.168429
16.8429
0.336409
11.521253
0.464469
16.591927
0.424202
36.022459
true
false
2024-10-08
2024-12-30
1
grimjim/Magnolia-v1-Gemma2-8k-9B (Merge)
grimjim_Magnolia-v2-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magnolia-v2-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magnolia-v2-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magnolia-v2-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magnolia-v2-12B
9a66fd94f74bc63b4b944969ca0777107a97d47b
21.577222
apache-2.0
1
12.248
true
false
false
false
2.002732
0.350612
35.061193
0.529028
32.504625
0.129154
12.915408
0.318792
9.17226
0.417125
10.907292
0.360123
28.902556
true
false
2024-12-05
2024-12-30
1
grimjim/Magnolia-v2-12B (Merge)
grimjim_Magnolia-v2-Gemma2-8k-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magnolia-v2-Gemma2-8k-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magnolia-v2-Gemma2-8k-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magnolia-v2-Gemma2-8k-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magnolia-v2-Gemma2-8k-9B
f9617a4216e6719b776f426dde6dce654cd93510
34.301345
gemma
3
9.242
true
false
false
false
3.329443
0.738442
73.844178
0.601577
42.844617
0.228097
22.809668
0.357383
14.317673
0.448844
14.972135
0.433178
37.019799
true
false
2024-10-09
2024-12-29
1
grimjim/Magnolia-v2-Gemma2-8k-9B (Merge)
grimjim_Magnolia-v3-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magnolia-v3-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magnolia-v3-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magnolia-v3-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magnolia-v3-12B
b93104eefc85fc527b87973f79e28d149eb5135e
22.786766
apache-2.0
3
12.248
true
false
false
false
2.000108
0.396499
39.649907
0.532667
32.924916
0.135196
13.519637
0.325503
10.067114
0.418396
11.499479
0.361536
29.059545
true
false
2024-12-11
2024-12-28
1
grimjim/Magnolia-v3-12B (Merge)
grimjim_Magnolia-v3-Gemma2-8k-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magnolia-v3-Gemma2-8k-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magnolia-v3-Gemma2-8k-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magnolia-v3-Gemma2-8k-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magnolia-v3-Gemma2-8k-9B
89cd67348bdb122cb8986053e66df4348cb1044a
34.353725
gemma
2
9.242
true
false
false
false
3.417031
0.737842
73.784226
0.601541
42.86823
0.231873
23.187311
0.356544
14.205817
0.448813
15.001563
0.433677
37.075207
true
false
2024-12-30
2024-12-30
1
grimjim/Magnolia-v3-Gemma2-8k-9B (Merge)
grimjim_Magnolia-v4-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magnolia-v4-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magnolia-v4-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magnolia-v4-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magnolia-v4-12B
45605f7d55339291145798b55f3e613b0dfe9b11
22.59384
apache-2.0
2
12.248
true
false
false
false
0.823437
0.341794
34.179422
0.543089
34.577483
0.13142
13.141994
0.32802
10.402685
0.421125
13.573958
0.367188
29.6875
true
false
2025-01-11
2025-02-21
1
grimjim/Magnolia-v4-12B (Merge)
grimjim_Magnolia-v5a-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magnolia-v5a-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magnolia-v5a-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magnolia-v5a-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magnolia-v5a-12B
e9e38c0b451c9e7fe96ffd58a669271d684c6a2d
22.851795
apache-2.0
2
12.248
true
false
false
false
0.848583
0.411362
41.136185
0.531176
32.694923
0.137462
13.746224
0.322148
9.619687
0.41449
11.011198
0.360123
28.902556
true
false
2025-02-21
2025-02-21
1
grimjim/Magnolia-v5a-12B (Merge)
grimjim_Magot-v1-Gemma2-8k-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magot-v1-Gemma2-8k-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magot-v1-Gemma2-8k-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magot-v1-Gemma2-8k-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magot-v1-Gemma2-8k-9B
afae94acb42bc0dcf1d31b7338cb79c0bcab1829
24.587403
gemma
2
9.242
true
false
false
false
5.908074
0.299678
29.967819
0.601945
42.818128
0.098943
9.89426
0.346477
12.863535
0.448844
14.905469
0.433677
37.075207
true
false
2024-09-09
2024-09-19
1
grimjim/Magot-v1-Gemma2-8k-9B (Merge)
grimjim_Magot-v2-Gemma2-8k-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/Magot-v2-Gemma2-8k-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/Magot-v2-Gemma2-8k-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__Magot-v2-Gemma2-8k-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/Magot-v2-Gemma2-8k-9B
e214d3e65d6efd4a0a9209ac615c1735aac71ec7
32.979956
gemma
1
9.242
true
false
false
false
3.096882
0.734745
73.474492
0.589671
41.459291
0.201662
20.166163
0.354027
13.870246
0.434396
13.099479
0.422291
35.810062
true
false
2024-09-27
2024-12-30
1
grimjim/Magot-v2-Gemma2-8k-9B (Merge)
grimjim_SauerHuatuoSkywork-o1-Llama-3.1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/SauerHuatuoSkywork-o1-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/SauerHuatuoSkywork-o1-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__SauerHuatuoSkywork-o1-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/SauerHuatuoSkywork-o1-Llama-3.1-8B
a832b371328483d2f003ea87afb4e1dc58045fdc
26.684472
llama3.1
2
8.03
true
false
false
false
1.399418
0.521946
52.194621
0.522208
32.088772
0.172961
17.296073
0.321309
9.50783
0.452687
15.785937
0.399102
33.233599
true
false
2025-01-27
2025-01-27
1
grimjim/SauerHuatuoSkywork-o1-Llama-3.1-8B (Merge)
grimjim_llama-3-Nephilim-v1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/llama-3-Nephilim-v1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/llama-3-Nephilim-v1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/llama-3-Nephilim-v1-8B
642799c8c768c53e831a03a1224db875116be866
21.729737
cc-by-nc-4.0
1
8.03
true
false
false
false
1.713745
0.427724
42.772399
0.513182
29.907537
0.090634
9.063444
0.302013
6.935123
0.413625
10.636458
0.379571
31.06346
true
false
2024-06-21
2024-06-26
1
grimjim/llama-3-Nephilim-v1-8B (Merge)
grimjim_llama-3-Nephilim-v2-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/llama-3-Nephilim-v2-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/llama-3-Nephilim-v2-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v2-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/llama-3-Nephilim-v2-8B
924f56cdefbfaf38deb6aee3ad301ced027e142d
20.60025
cc-by-nc-4.0
1
8.03
true
false
false
false
1.401747
0.392228
39.222818
0.504821
29.896264
0.106495
10.649547
0.299497
6.599553
0.3895
7.8875
0.364112
29.345819
true
false
2024-06-26
2024-09-18
1
grimjim/llama-3-Nephilim-v2-8B (Merge)
grimjim_llama-3-Nephilim-v2.1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/llama-3-Nephilim-v2.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/llama-3-Nephilim-v2.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v2.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/llama-3-Nephilim-v2.1-8B
5f516d9df1778dbe53ea941a754aef73b87e8eaa
20.434967
cc-by-nc-4.0
1
8.03
true
false
false
false
1.426317
0.389505
38.95054
0.509504
29.819664
0.099698
9.969789
0.299497
6.599553
0.3935
7.8875
0.364445
29.382757
true
false
2024-07-09
2024-09-18
1
grimjim/llama-3-Nephilim-v2.1-8B (Merge)
grimjim_llama-3-Nephilim-v3-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/grimjim/llama-3-Nephilim-v3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">grimjim/llama-3-Nephilim-v3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/grimjim__llama-3-Nephilim-v3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
grimjim/llama-3-Nephilim-v3-8B
fd012ba05116aad7dc297d0a866ddb3345a056a1
20.600989
cc-by-nc-4.0
14
8.03
true
false
false
false
1.128179
0.417383
41.738254
0.501267
28.955635
0.095166
9.516616
0.295302
6.040268
0.398927
8.332552
0.361203
29.022606
true
false
2024-07-14
2024-08-26
1
grimjim/llama-3-Nephilim-v3-8B (Merge)
gupta-tanish_llama-7b-dpo-baseline_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gupta-tanish/llama-7b-dpo-baseline" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gupta-tanish/llama-7b-dpo-baseline</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gupta-tanish__llama-7b-dpo-baseline-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gupta-tanish/llama-7b-dpo-baseline
1b5f1ef3ffa3b550619fbf64c33b6fd79e1bd559
11.85729
apache-2.0
0
6.738
true
false
false
false
1.562633
0.269304
26.930433
0.389689
14.380522
0.019637
1.963746
0.262584
1.677852
0.445625
14.769792
0.202793
11.421395
false
false
2024-09-29
2024-09-29
1
gupta-tanish/llama-7b-dpo-baseline (Merge)
gz987_qwen2.5-7b-cabs-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/gz987/qwen2.5-7b-cabs-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gz987/qwen2.5-7b-cabs-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gz987__qwen2.5-7b-cabs-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gz987/qwen2.5-7b-cabs-v0.1
5ef7ddd3d7c58504b8bbdee213c37105afade2a9
36.561613
mit
0
7.616
true
false
false
true
0.622445
0.750582
75.058179
0.548158
35.838183
0.479607
47.960725
0.313758
8.501119
0.437625
14.169792
0.440575
37.841681
true
false
2025-02-17
2025-02-18
1
gz987/qwen2.5-7b-cabs-v0.1 (Merge)
gz987_qwen2.5-7b-cabs-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/gz987/qwen2.5-7b-cabs-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gz987/qwen2.5-7b-cabs-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gz987__qwen2.5-7b-cabs-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gz987/qwen2.5-7b-cabs-v0.2
8e5dd3c00616adb80d49f5a83cdd01f1794f7662
36.614019
mit
0
7.616
true
false
false
true
0.651272
0.741764
74.176407
0.551626
36.275907
0.490181
49.018127
0.307047
7.606264
0.442865
14.858073
0.439744
37.749335
true
false
2025-02-18
2025-02-18
1
gz987/qwen2.5-7b-cabs-v0.2 (Merge)
gz987_qwen2.5-7b-cabs-v0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/gz987/qwen2.5-7b-cabs-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gz987/qwen2.5-7b-cabs-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gz987__qwen2.5-7b-cabs-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gz987/qwen2.5-7b-cabs-v0.3
2ad9b1f13a2e4dc4d6eed55eb706050c10232188
36.935047
mit
3
7.616
true
false
false
true
0.630818
0.756952
75.695156
0.549447
35.956652
0.493202
49.320242
0.307047
7.606264
0.442958
15.236458
0.44016
37.795508
true
false
2025-02-18
2025-02-18
1
gz987/qwen2.5-7b-cabs-v0.3 (Merge)
gz987_qwen2.5-7b-cabs-v0.4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/gz987/qwen2.5-7b-cabs-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gz987/qwen2.5-7b-cabs-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/gz987__qwen2.5-7b-cabs-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gz987/qwen2.5-7b-cabs-v0.4
642e0c1fd2012cd95e7236acbb0c245a60d1f391
36.881946
mit
1
7.616
true
false
false
true
0.629712
0.75825
75.825033
0.55244
36.358438
0.484894
48.489426
0.307886
7.718121
0.442958
15.169792
0.439578
37.730866
true
false
2025-02-18
2025-02-18
1
gz987/qwen2.5-7b-cabs-v0.4 (Merge)
h2oai_h2o-danube-1.8b-chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2o-danube-1.8b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2o-danube-1.8b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube-1.8b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2o-danube-1.8b-chat
ff4789b36ed0875184b8c67697e434dbd63bb04f
6.953762
apache-2.0
54
1.831
true
false
false
false
0.262545
0.21987
21.986995
0.321966
5.269859
0.013595
1.359517
0.254195
0.559284
0.398865
9.058073
0.1314
3.488845
false
false
2024-01-25
2025-03-05
0
h2oai/h2o-danube-1.8b-chat
h2oai_h2o-danube3-4b-base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2o-danube3-4b-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2o-danube3-4b-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3-4b-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2o-danube3-4b-base
6bdf2f1e317143c998b88d9e9d72facc621a863f
10.090849
apache-2.0
21
3.962
true
false
false
false
0.889004
0.233809
23.380852
0.359908
10.564444
0.022659
2.265861
0.291107
5.480984
0.377813
6.526563
0.210938
12.326389
false
false
2024-07-04
2024-08-10
0
h2oai/h2o-danube3-4b-base
h2oai_h2o-danube3-4b-chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2o-danube3-4b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2o-danube3-4b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3-4b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2o-danube3-4b-chat
1e5c6fa6620f8bf078958069ab4581cd88e0202c
11.571247
apache-2.0
66
3.962
true
false
false
true
0.925243
0.362877
36.287717
0.346617
8.839703
0.040785
4.07855
0.260067
1.342282
0.378125
5.232292
0.222822
13.646941
false
false
2024-07-04
2024-07-15
0
h2oai/h2o-danube3-4b-chat
h2oai_h2o-danube3-500m-chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2o-danube3-500m-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2o-danube3-500m-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3-500m-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2o-danube3-500m-chat
c202f976c26875541e738ea978c8158fa536da9a
5.20444
apache-2.0
34
0.514
true
false
false
true
0.437807
0.220794
22.079416
0.303469
3.06537
0.016616
1.661631
0.230705
0
0.343396
2.824479
0.114362
1.595745
false
false
2024-07-04
2024-10-11
0
h2oai/h2o-danube3-500m-chat
h2oai_h2o-danube3.1-4b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2o-danube3.1-4b-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2o-danube3.1-4b-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/h2oai__h2o-danube3.1-4b-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2o-danube3.1-4b-chat
e649b5c5844432e0b3e1b1102b6218604e6cbdb8
16.412128
apache-2.0
1
3.962
true
false
false
true
0.598281
0.502112
50.211217
0.360842
10.942063
0.033233
3.323263
0.285235
4.697987
0.410156
10.202865
0.271858
19.095375
false
false
2024-11-29
2024-11-29
0
h2oai/h2o-danube3.1-4b-chat
haoranxu_ALMA-13B-R_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/haoranxu/ALMA-13B-R" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/ALMA-13B-R</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/haoranxu__ALMA-13B-R-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/ALMA-13B-R
b69ebad694274b929cfcf3db29dd7bb93d752e39
3.877302
mit
81
13
true
false
false
false
1.92526
0.003922
0.392182
0.345656
8.819669
0.017372
1.73716
0.25755
1.006711
0.352792
2.232292
0.181682
9.075798
false
false
2024-01-17
2024-10-01
0
haoranxu/ALMA-13B-R
haoranxu_Llama-3-Instruct-8B-CPO-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/haoranxu/Llama-3-Instruct-8B-CPO-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/Llama-3-Instruct-8B-CPO-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/haoranxu__Llama-3-Instruct-8B-CPO-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/Llama-3-Instruct-8B-CPO-SimPO
3ca4b5c3a6395ff090e1039d55ac1f6120777302
24.910737
mit
1
8.03
true
false
false
true
1.49067
0.704645
70.464479
0.50483
29.762188
0.102719
10.271903
0.292785
5.704698
0.356667
3.416667
0.3686
29.844489
false
false
2024-06-19
2024-07-28
0
haoranxu/Llama-3-Instruct-8B-CPO-SimPO
haoranxu_Llama-3-Instruct-8B-SimPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/haoranxu/Llama-3-Instruct-8B-SimPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/Llama-3-Instruct-8B-SimPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/haoranxu__Llama-3-Instruct-8B-SimPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/Llama-3-Instruct-8B-SimPO
8346770280fa169d41d737785dd63a66e9d94501
24.990729
llama3
1
8.03
true
false
false
true
1.159156
0.734745
73.474492
0.497924
28.226376
0.087613
8.761329
0.290268
5.369128
0.356604
3.742188
0.373338
30.370863
false
false
2024-06-07
2024-07-28
1
meta-llama/Meta-Llama-3-8B-Instruct
hatemmahmoud_qwen2.5-1.5b-sft-raft-grpo-hra-doc_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/hatemmahmoud/qwen2.5-1.5b-sft-raft-grpo-hra-doc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hatemmahmoud/qwen2.5-1.5b-sft-raft-grpo-hra-doc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hatemmahmoud__qwen2.5-1.5b-sft-raft-grpo-hra-doc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hatemmahmoud/qwen2.5-1.5b-sft-raft-grpo-hra-doc
aeb5362a28d3f7718233553bb039cd0ac0ee04e4
18.164353
0
1.544
false
false
false
true
0.615624
0.41958
41.958005
0.426993
19.805227
0.217523
21.752266
0.267617
2.348993
0.360979
3.389063
0.277593
19.732565
false
false
2025-03-07
2025-03-12
0
hatemmahmoud/qwen2.5-1.5b-sft-raft-grpo-hra-doc
hon9kon9ize_CantoneseLLMChat-v0.5_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hon9kon9ize/CantoneseLLMChat-v0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hon9kon9ize/CantoneseLLMChat-v0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hon9kon9ize__CantoneseLLMChat-v0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hon9kon9ize/CantoneseLLMChat-v0.5
812eb4f168c3ea258ebb220393401db9578e0f67
15.959801
apache-2.0
9
6.069
true
false
false
false
1.667268
0.323085
32.308497
0.434524
20.761385
0.041541
4.154079
0.277685
3.691275
0.470646
18.130729
0.250416
16.71284
false
false
2024-07-01
2024-07-07
0
hon9kon9ize/CantoneseLLMChat-v0.5
hon9kon9ize_CantoneseLLMChat-v1.0-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/hon9kon9ize/CantoneseLLMChat-v1.0-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hon9kon9ize/CantoneseLLMChat-v1.0-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hon9kon9ize__CantoneseLLMChat-v1.0-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hon9kon9ize/CantoneseLLMChat-v1.0-7B
4703b1afc7aab8e3a8059432fd1c4b0aba011482
23.50387
other
4
7.616
true
false
false
true
3.662793
0.445484
44.548354
0.486573
28.536136
0.210725
21.072508
0.322148
9.619687
0.388292
6.303125
0.378491
30.94341
false
false
2024-10-02
2024-10-10
1
Removed
hongbai12_li-0.4-pre_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/hongbai12/li-0.4-pre" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hongbai12/li-0.4-pre</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hongbai12__li-0.4-pre-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hongbai12/li-0.4-pre
6b5c78e54a187d3c992fa3033456e63bfaf81349
36.492637
0
14.77
false
false
false
true
1.969252
0.519973
51.997256
0.629827
46.725545
0.492447
49.244713
0.322987
9.731544
0.451302
16.646094
0.501496
44.610668
false
false
2025-03-05
0
Removed
hotmailuser_Deepseek-qwen-modelstock-2B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/Deepseek-qwen-modelstock-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/Deepseek-qwen-modelstock-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Deepseek-qwen-modelstock-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/Deepseek-qwen-modelstock-2B
cf45d578f711d7c11a6b376b03bbcee4159f962a
13.73455
mit
0
1.777
true
false
false
false
1.177681
0.214874
21.487431
0.354924
10.020169
0.339879
33.987915
0.280201
4.026846
0.347458
2.765625
0.191074
10.119311
true
false
2025-01-24
2025-01-26
1
hotmailuser/Deepseek-qwen-modelstock-2B (Merge)
hotmailuser_Falcon3Slerp1-10B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/Falcon3Slerp1-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/Falcon3Slerp1-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Falcon3Slerp1-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/Falcon3Slerp1-10B
74d06a0b8e34e4a4c83a3d1e4a4f36b0e548aeaf
31.77668
other
1
10.306
true
false
false
false
1.644068
0.569407
56.940695
0.616985
44.743987
0.259819
25.981873
0.34396
12.527964
0.43176
12.670052
0.44016
37.795508
true
false
2024-12-19
2024-12-23
1
hotmailuser/Falcon3Slerp1-10B (Merge)
hotmailuser_Falcon3Slerp2-10B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/Falcon3Slerp2-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/Falcon3Slerp2-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Falcon3Slerp2-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/Falcon3Slerp2-10B
03059d4fe878b69ba65d248f213bc416b4a9e5ed
31.308606
other
0
10.306
true
false
false
false
1.503347
0.611797
61.17967
0.616426
44.54235
0.231873
23.187311
0.338087
11.744966
0.409563
9.761979
0.436918
37.435358
true
false
2024-12-19
2024-12-23
1
hotmailuser/Falcon3Slerp2-10B (Merge)
hotmailuser_Falcon3Slerp4-10B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/Falcon3Slerp4-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/Falcon3Slerp4-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Falcon3Slerp4-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/Falcon3Slerp4-10B
43cf3a3669f104492a0e883171da8974cf16b727
30.717493
other
0
10.306
true
false
false
false
1.614832
0.607225
60.72255
0.611434
43.758732
0.228852
22.885196
0.328859
10.514541
0.40175
8.785417
0.438747
37.63852
true
false
2024-12-23
2024-12-23
1
hotmailuser/Falcon3Slerp4-10B (Merge)
hotmailuser_FalconSlerp-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/FalconSlerp-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/FalconSlerp-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__FalconSlerp-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/FalconSlerp-3B
b3d2a778f7188fd56d4539bfd6f18be4baf512c8
22.472736
apache-2.0
0
3.228
true
false
false
false
0.922141
0.569457
56.945682
0.462391
24.394008
0.175982
17.598187
0.287752
5.033557
0.398927
8.999219
0.296792
21.865765
true
false
2025-01-06
2025-01-08
1
hotmailuser/FalconSlerp-3B (Merge)
hotmailuser_FalconSlerp1-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/FalconSlerp1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/FalconSlerp1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__FalconSlerp1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/FalconSlerp1-7B
3708e7cb80f6309d5cefb68aa0be3fa8f76eb969
28.681198
other
0
7.456
true
false
false
false
1.227822
0.539456
53.945642
0.535468
35.04309
0.237915
23.791541
0.319631
9.284116
0.44525
15.25625
0.412899
34.766548
true
false
2024-12-18
2024-12-23
1
hotmailuser/FalconSlerp1-7B (Merge)
hotmailuser_FalconSlerp2-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/FalconSlerp2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/FalconSlerp2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__FalconSlerp2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/FalconSlerp2-7B
219ee277427f3bc985b82d985f47f8f57fbd5236
31.286856
other
0
7.456
true
false
false
false
1.231664
0.616043
61.604321
0.553781
36.817353
0.298338
29.833837
0.319631
9.284116
0.447885
15.285677
0.414063
34.895833
true
false
2024-12-18
2024-12-23
1
hotmailuser/FalconSlerp2-7B (Merge)
hotmailuser_FalconSlerp3-10B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/FalconSlerp3-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/FalconSlerp3-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__FalconSlerp3-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/FalconSlerp3-10B
2752f054efbfee78630fcc54ac7b3366ba778042
30.415116
other
0
10.306
true
false
false
false
1.599303
0.600156
60.015647
0.606029
42.818643
0.227341
22.734139
0.33557
11.409396
0.403083
8.585417
0.432347
36.927453
true
false
2024-12-23
2024-12-23
1
hotmailuser/FalconSlerp3-10B (Merge)
hotmailuser_FalconSlerp3-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/FalconSlerp3-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/FalconSlerp3-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__FalconSlerp3-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/FalconSlerp3-7B
be57bac375fbc5bc8e5c93b3f9476c4ea20f4585
31.531502
other
0
7.456
true
false
false
false
1.213324
0.609624
60.962358
0.553297
36.834016
0.31571
31.570997
0.318792
9.17226
0.450677
15.901302
0.412733
34.748079
true
false
2024-12-18
2024-12-23
1
hotmailuser/FalconSlerp3-7B (Merge)
hotmailuser_FalconSlerp4-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/FalconSlerp4-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/FalconSlerp4-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__FalconSlerp4-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/FalconSlerp4-7B
2d16b7120c7b300877b65cae7ee334e9bf28894a
30.512276
0
7.456
false
false
false
false
1.229933
0.628458
62.845805
0.552351
36.468105
0.221299
22.129909
0.332215
10.961969
0.458521
16.981771
0.403175
33.686096
false
false
2025-01-06
2025-01-08
1
hotmailuser/FalconSlerp4-7B (Merge)
hotmailuser_FalconSlerp6-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/FalconSlerp6-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/FalconSlerp6-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__FalconSlerp6-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/FalconSlerp6-7B
bfe93343c4c409a4be69fa28141ec8d42e3108ff
28.795206
other
0
7.456
true
false
false
false
1.184895
0.602654
60.265429
0.53838
34.478345
0.204683
20.468278
0.317953
9.060403
0.449219
15.21901
0.399518
33.279772
true
false
2025-01-19
2025-01-26
1
hotmailuser/FalconSlerp6-7B (Merge)
hotmailuser_Gemma2Crono-27B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/Gemma2Crono-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/Gemma2Crono-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Gemma2Crono-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/Gemma2Crono-27B
68feccf9af840291c9ce4dea83bdd7b68c351f45
36.288749
apache-2.0
0
27.227
true
false
false
false
7.775828
0.708616
70.861647
0.650534
50.103412
0.242447
24.244713
0.370805
16.107383
0.456687
16.052604
0.463265
40.362736
true
false
2024-12-02
2024-12-02
1
hotmailuser/Gemma2Crono-27B (Merge)
hotmailuser_Gemma2SimPO-27B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/hotmailuser/Gemma2SimPO-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hotmailuser/Gemma2SimPO-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/hotmailuser__Gemma2SimPO-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hotmailuser/Gemma2SimPO-27B
59d5de8216b2b53abcf56a79ebb630d17a856d00
36.432834
0
27.227
false
false
false
false
8.929292
0.72223
72.223035
0.641316
49.159219
0.281722
28.172205
0.358221
14.42953
0.444656
14.148698
0.464179
40.464317
false
false
2024-12-01
0
Removed