Dataset schema (one row per evaluated model):

| Column | Dtype | Range / classes |
| --- | --- | --- |
| eval_name | string | length 12 to 111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | length 355 to 689 |
| fullname | string | length 4 to 102 |
| Model sha | string | length 0 to 40 |
| Average ⬆️ | float64 | 0.74 to 52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0 to 6.09k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04 to 187 |
| IFEval Raw | float64 | 0 to 0.9 |
| IFEval | float64 | 0 to 90 |
| BBH Raw | float64 | 0.22 to 0.83 |
| BBH | float64 | 0.25 to 76.7 |
| MATH Lvl 5 Raw | float64 | 0 to 0.71 |
| MATH Lvl 5 | float64 | 0 to 71.5 |
| GPQA Raw | float64 | 0.21 to 0.47 |
| GPQA | float64 | 0 to 29.4 |
| MUSR Raw | float64 | 0.29 to 0.6 |
| MUSR | float64 | 0 to 38.7 |
| MMLU-PRO Raw | float64 | 0.1 to 0.73 |
| MMLU-PRO | float64 | 0 to 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 classes |
| Submission Date | string | 263 classes |
| Generation | int64 | 0 to 10 |
| Base Model | string | length 4 to 102 |
Every record below shares the following constant fields, omitted from the tables: Weight type = Original, Hub ❤️ = 0, Chat Template = true, and MoE = Flagged = Merged = Official Providers = false. Two columns are derived: eval_name is fullname with "/" replaced by "_" plus a "_{Precision}" suffix, and T is the emoji shorthand of Type (🔶 = "fine-tuned on domain-specific datasets", 💬 = "chat models (RLHF, DPO, IFT, ...)"). The Model column rendered two links per row, both reconstructible from fullname: the model page at `https://huggingface.co/{fullname}` and its evaluation details at `https://huggingface.co/datasets/open-llm-leaderboard/{fullname with "/" replaced by "__"}-details`.

**Model metadata**

| fullname | Precision | Type | Architecture | Model sha | Hub License | #Params (B) | On hub | Uploaded | Submitted | Gen | Base Model |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| JayHyeon/Qwen_0.5-MDPO_0.5_7e-7-3ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | 5851868459ddc86358bfdcfd65a811e2127993cd | – | 0.63 | false | 2025-01-08 | 2025-01-10 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-MDPO_0.7_3e-6-3ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | 6a2bc27b8052769f882501b7403afe5ae5240548 | – | 0.63 | false | 2025-01-08 | 2025-01-09 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-MDPO_0.7_5e-7-3ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | df437c8128c3d99f6d9d0cffef573b9f0d3fd458 | – | 0.63 | false | 2025-01-07 | 2025-01-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-MDPO_0.9_5e-7-3ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | 62a07f38868329115db25dd196c5aa3f03b27fe3 | – | 0.63 | false | 2025-01-09 | 2025-01-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_3e-6-1ep_3vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 86c3f69c05edb7cd30617129825f9c0f1b6cfd47 | – | 0.63 | false | 2025-02-25 | 2025-02-27 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | 98a1ff10ebf69d5c0a83820fbe962777a7a36295 | – | 0.63 | false | 2025-01-25 | 2025-01-25 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_10vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 40a33af5a65a4821f73a092060c0fbc26b71de76 | – | 0.63 | false | 2025-02-20 | 2025-02-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_1vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 797fba63d80f5d798b7bfaff04460ab5d3b02190 | – | 0.63 | false | 2025-02-20 | 2025-02-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_3vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 1adc557891a1a092249e7ecdd4500dc3fdb02e83 | – | 0.63 | false | 2025-02-20 | 2025-02-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_5e-7-3ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | 3eb6d75040e859b5ded8987a4d08d168110c2948 | – | 0.63 | false | 2025-01-25 | 2025-01-25 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_5e-7-3ep_1vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | ee1bdbc1e3d8253a8b0e5993d8a878d5c41f2808 | – | 0.63 | false | 2025-02-08 | 2025-02-10 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VDPO_5e-7-3ep_3vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | d2f49ff695a41b6c443b1a211f8c5698eed18e33 | – | 0.63 | false | 2025-02-08 | 2025-02-10 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | 1935f4c3d5625e9d56de5505323bc705019ce9ae | – | 0.63 | false | 2025-01-25 | 2025-01-25 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_10vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | b60f933f70c85d41946b47a3b86150eb4a36e933 | – | 0.63 | false | 2025-02-20 | 2025-02-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_1vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | b75e64467b66f304a1f834f936e17de86950428f | – | 0.63 | false | 2025-02-20 | 2025-02-20 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_30vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 32887d1f909914b5c73ae91cc93c61e7b11b8c0a | – | 0.63 | false | 2025-02-20 | 2025-02-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_3vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 167a76f8c3787035a2304395d51b9eeb3cadfcf3 | – | 0.63 | false | 2025-02-20 | 2025-02-21 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_0alp_0lam | bfloat16 | 🔶 | Qwen2ForCausalLM | 4e7e89a1964d83b08d23f8ed435921ab56bd8269 | – | 0.63 | false | 2025-01-25 | 2025-01-25 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_10vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 4a601df8da78c41435239c1521526bbf2c7aabb3 | – | 0.63 | false | 2025-02-06 | 2025-02-08 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_1vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 37188bcb5c994b1f3c5b45d425945ff46326391d | – | 0.63 | false | 2025-02-06 | 2025-02-08 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_30vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 8730504c742876706bdd399d57aaa36671b23149 | – | 0.63 | false | 2025-02-06 | 2025-02-08 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_3vpo_const | bfloat16 | 🔶 | Qwen2ForCausalLM | 4322bfbee423ecf57c561bc3fe821d6a54d8570c | – | 0.63 | false | 2025-02-06 | 2025-02-08 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-cDPO_5e-7-3ep_0vpo_const_0.1 | bfloat16 | 🔶 | Qwen2ForCausalLM | 3c369e3227ae050829233e92ee3238c36490f607 | – | 0.63 | false | 2025-02-19 | 2025-02-20 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-cDPO_5e-7-3ep_0vpo_const_0.3 | bfloat16 | 🔶 | Qwen2ForCausalLM | 7b50bab8ec188103a31d5bb3ebd3ff5c54a719b7 | – | 0.63 | false | 2025-02-16 | 2025-02-17 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-rDPO_3e-6-1ep_0vpo_const_0.1 | bfloat16 | 🔶 | Qwen2ForCausalLM | e5712d2ba9fbb61af9c9cdd22d264cf16cb7bcdd | – | 0.63 | false | 2025-02-26 | 2025-02-27 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-rDPO_5e-7-3ep_0vpo_const_0.1 | bfloat16 | 🔶 | Qwen2ForCausalLM | 33c2af4b4fd13f9ff5037d0f2f8320a29f8c6cef | – | 0.63 | false | 2025-02-18 | 2025-02-20 | 2 | Qwen/Qwen2.5-0.5B |
| JayHyeon/Qwen_0.5-rDPO_5e-7-3ep_0vpo_const_0.3 | bfloat16 | 🔶 | Qwen2ForCausalLM | bbffb6820cbdf889675c983aa617a3d4a9dbafeb | – | 0.63 | false | 2025-02-15 | 2025-02-17 | 2 | Qwen/Qwen2.5-0.5B |
| Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2 | float16 | 💬 | LlamaForCausalLM | 53a517ceaef324efc3626be44140b4f18a010591 | – | 8.03 | false | 2024-09-06 | – | 0 | Removed |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun | bfloat16 | 🔶 | LlamaForCausalLM | 00c02a823b4ff1a6cfcded6085ba9630df633998 | llama3 | 8.03 | true | 2024-09-17 | 2024-09-18 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log | bfloat16 | 🔶 | LlamaForCausalLM | 99d9e31df5b7e88b1da78b1bd335cac3215dfd6e | llama3 | 8.03 | true | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log | bfloat16 | 🔶 | LlamaForCausalLM | 49a029ea2605d768e89b638ad78a59fd62d192ab | llama3 | 8.03 | true | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4 | bfloat16 | 💬 | LlamaForCausalLM | de8bb28ad7a9d1158f318a4461dc47ad03e6e560 | – | 8.03 | false | 2024-09-06 | – | 0 | Removed |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun | bfloat16 | 💬 | LlamaForCausalLM | e9692d8dbe30273839763757aa9ef07a5fcf0c59 | llama3 | 8.03 | true | 2024-09-14 | 2024-09-15 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log | bfloat16 | 🔶 | LlamaForCausalLM | 9ff0ce408abb8dbcf7efb9b6533338f2c344a355 | llama3 | 8.03 | true | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log | bfloat16 | 🔶 | LlamaForCausalLM | ec67f95c4d1813a34bbde52d0ad14824fd7111a0 | llama3 | 8.03 | true | 2024-09-22 | 2024-09-22 | 1 | meta-llama/Meta-Llama-3-8B-Instruct |

**Benchmark scores** (each benchmark cell is "raw / normalized")

| fullname | Average ⬆️ | CO₂ cost (kg) | IFEval | BBH | MATH Lvl 5 | GPQA | MUSR | MMLU-PRO |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| JayHyeon/Qwen_0.5-MDPO_0.5_7e-7-3ep_0alp_0lam | 7.688014 | 1.762856 | 0.252368 / 25.236816 | 0.312969 / 6.263411 | 0.044562 / 4.456193 | 0.270973 / 2.796421 | 0.328854 / 1.106771 | 0.156416 / 6.268469 |
| JayHyeon/Qwen_0.5-MDPO_0.7_3e-6-3ep_0alp_0lam | 7.816324 | 1.618061 | 0.251394 / 25.139408 | 0.322096 / 6.770439 | 0.043807 / 4.380665 | 0.275168 / 3.355705 | 0.33149 / 1.269531 | 0.15384 / 5.982196 |
| JayHyeon/Qwen_0.5-MDPO_0.7_5e-7-3ep_0alp_0lam | 7.460109 | 2.345511 | 0.245674 / 24.56737 | 0.318009 / 6.636596 | 0.03852 / 3.851964 | 0.266779 / 2.237136 | 0.327521 / 1.106771 | 0.157247 / 6.360816 |
| JayHyeon/Qwen_0.5-MDPO_0.9_5e-7-3ep_0alp_0lam | 7.877894 | 1.830994 | 0.263634 / 26.363382 | 0.318069 / 6.534335 | 0.047583 / 4.758308 | 0.26594 / 2.12528 | 0.323521 / 1.106771 | 0.157414 / 6.379285 |
| JayHyeon/Qwen_0.5-VDPO_3e-6-1ep_3vpo_const | 7.151029 | 0.876748 | 0.248297 / 24.829674 | 0.317431 / 6.104663 | 0.037764 / 3.776435 | 0.254195 / 0.559284 | 0.332792 / 1.432292 | 0.155834 / 6.203827 |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_0alp_0lam | 8.018805 | 1.617092 | 0.251769 / 25.176864 | 0.321802 / 6.860999 | 0.05287 / 5.287009 | 0.271812 / 2.908277 | 0.32349 / 1.269531 | 0.159491 / 6.610151 |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_10vpo_const | 8.070118 | 0.798943 | 0.253617 / 25.361707 | 0.323433 / 7.109441 | 0.049094 / 4.909366 | 0.276007 / 3.467562 | 0.323552 / 0.94401 | 0.159658 / 6.62862 |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_1vpo_const | 8.079611 | 0.834673 | 0.244799 / 24.479935 | 0.323953 / 6.975156 | 0.060423 / 6.042296 | 0.275168 / 3.355705 | 0.324854 / 1.106771 | 0.15866 / 6.517804 |
| JayHyeon/Qwen_0.5-VDPO_5e-7-1ep_3vpo_const | 7.784158 | 0.834472 | 0.25047 / 25.046986 | 0.322699 / 6.852004 | 0.046828 / 4.682779 | 0.270973 / 2.796421 | 0.320917 / 0.78125 | 0.15891 / 6.545508 |
| JayHyeon/Qwen_0.5-VDPO_5e-7-3ep_0alp_0lam | 8.039736 | 1.658191 | 0.247197 / 24.719744 | 0.325506 / 7.227976 | 0.049849 / 4.984894 | 0.275168 / 3.355705 | 0.320792 / 1.432292 | 0.15866 / 6.517804 |
| JayHyeon/Qwen_0.5-VDPO_5e-7-3ep_1vpo_const | 8.004445 | 1.805673 | 0.241652 / 24.165215 | 0.325589 / 7.343317 | 0.058157 / 5.81571 | 0.272651 / 3.020134 | 0.327458 / 1.432292 | 0.15625 / 6.25 |
| JayHyeon/Qwen_0.5-VDPO_5e-7-3ep_3vpo_const | 8.231694 | 0.962976 | 0.252693 / 25.269285 | 0.323541 / 7.241752 | 0.053625 / 5.362538 | 0.278523 / 3.803132 | 0.32349 / 1.269531 | 0.157995 / 6.443927 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_0alp_0lam | 8.713717 | 1.571816 | 0.266856 / 26.685639 | 0.331374 / 7.834261 | 0.070997 / 7.099698 | 0.267617 / 2.348993 | 0.316823 / 1.269531 | 0.163398 / 7.044178 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_10vpo_const | 8.993053 | 0.849232 | 0.270229 / 27.022855 | 0.32998 / 7.69224 | 0.074018 / 7.401813 | 0.275168 / 3.355705 | 0.320792 / 1.432292 | 0.163481 / 7.053413 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_1vpo_const | 8.292943 | 0.829359 | 0.248022 / 24.802192 | 0.330862 / 7.776385 | 0.067976 / 6.797583 | 0.264262 / 1.901566 | 0.320823 / 1.269531 | 0.164894 / 7.210402 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_30vpo_const | 8.721618 | 0.803219 | 0.262235 / 26.223531 | 0.328199 / 7.655187 | 0.074018 / 7.401813 | 0.269295 / 2.572707 | 0.322125 / 1.432292 | 0.163398 / 7.044178 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-1ep_3vpo_const | 8.601147 | 0.829118 | 0.260861 / 26.086118 | 0.329802 / 7.670334 | 0.064955 / 6.495468 | 0.270134 / 2.684564 | 0.316792 / 1.432292 | 0.165143 / 7.238106 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_0alp_0lam | 8.579606 | 1.384664 | 0.293035 / 29.30347 | 0.321955 / 6.099203 | 0.062689 / 6.268882 | 0.268456 / 2.46085 | 0.311583 / 0.78125 | 0.159076 / 6.563978 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_10vpo_const | 8.853043 | 0.653186 | 0.288139 / 28.813881 | 0.325538 / 6.45427 | 0.072508 / 7.250755 | 0.275168 / 3.355705 | 0.31025 / 0.78125 | 0.158162 / 6.462397 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_1vpo_const | 9.001725 | 0.659494 | 0.288738 / 28.873833 | 0.323702 / 6.083942 | 0.074773 / 7.477341 | 0.280201 / 4.026846 | 0.31425 / 0.78125 | 0.160904 / 6.767139 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_30vpo_const | 8.944504 | 0.73924 | 0.290537 / 29.053689 | 0.325439 / 6.616879 | 0.077039 / 7.703927 | 0.27349 / 3.131991 | 0.312917 / 0.78125 | 0.157414 / 6.379285 |
| JayHyeon/Qwen_0.5-VIPO_5e-7-3ep_3vpo_const | 8.730938 | 0.69036 | 0.290487 / 29.048702 | 0.323817 / 5.989063 | 0.070242 / 7.024169 | 0.27349 / 3.131991 | 0.308948 / 0.61849 | 0.159159 / 6.573212 |
| JayHyeon/Qwen_0.5-cDPO_5e-7-3ep_0vpo_const_0.1 | 7.905105 | 0.789834 | 0.239254 / 23.925407 | 0.324419 / 7.201175 | 0.05136 / 5.135952 | 0.277685 / 3.691275 | 0.322188 / 1.106771 | 0.15733 / 6.37005 |
| JayHyeon/Qwen_0.5-cDPO_5e-7-3ep_0vpo_const_0.3 | 7.980368 | 0.827034 | 0.247472 / 24.747226 | 0.320906 / 6.986082 | 0.046073 / 4.607251 | 0.28104 / 4.138702 | 0.327521 / 1.106771 | 0.156666 / 6.296173 |
| JayHyeon/Qwen_0.5-rDPO_3e-6-1ep_0vpo_const_0.1 | 7.083621 | 0.767836 | 0.232135 / 23.213518 | 0.327797 / 6.905189 | 0.047583 / 4.758308 | 0.25755 / 1.006711 | 0.302188 / 1.106771 | 0.149601 / 5.511229 |
| JayHyeon/Qwen_0.5-rDPO_5e-7-3ep_0vpo_const_0.1 | 8.149324 | 0.794532 | 0.254167 / 25.416672 | 0.325312 / 7.196411 | 0.05287 / 5.287009 | 0.270973 / 2.796421 | 0.318125 / 1.432292 | 0.160904 / 6.767139 |
| JayHyeon/Qwen_0.5-rDPO_5e-7-3ep_0vpo_const_0.3 | 7.715291 | 0.71595 | 0.273876 / 27.387554 | 0.32451 / 6.775212 | 0.046073 / 4.607251 | 0.250839 / 0.111857 | 0.308917 / 0.78125 | 0.159658 / 6.62862 |
| Jimmy19991222/Llama-3-Instruct-8B-SimPO-v0.2 | 24.594651 | 1.026303 | 0.654037 / 65.403684 | 0.498371 / 29.123823 | 0.061934 / 6.193353 | 0.314597 / 8.612975 | 0.40125 / 8.389583 | 0.3686 / 29.844489 |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert-f1-beta10-gamma0.3-lr1.0e-6-1minus-rerun | 24.144995 | 0.963582 | 0.671722 / 67.172214 | 0.48798 / 27.755229 | 0.060423 / 6.042296 | 0.294463 / 5.928412 | 0.404073 / 8.709115 | 0.363364 / 29.262707 |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_f1-beta10-gamma0.3-lr1.0e-6-scale-log | 24.096149 | 0.957071 | 0.655561 / 65.556058 | 0.493458 / 28.613597 | 0.054381 / 5.438066 | 0.30453 / 7.270694 | 0.40001 / 8.167969 | 0.365775 / 29.530511 |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bert_p-beta10-gamma0.3-lr1.0e-6-scale-log | 23.037153 | 1.04497 | 0.631506 / 63.150552 | 0.491641 / 27.666184 | 0.064955 / 6.495468 | 0.286074 / 4.809843 | 0.3935 / 7.0875 | 0.36112 / 29.013372 |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-bleu-beta0.1-no-length-scale-gamma0.4 | 23.393777 | 0.960742 | 0.628458 / 62.845805 | 0.498609 / 29.329732 | 0.05136 / 5.135952 | 0.292785 / 5.704698 | 0.401375 / 9.071875 | 0.354471 / 28.274601 |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-1minus-gamma0.3-rerun | 24.385612 | 1.513698 | 0.66775 / 66.775046 | 0.494046 / 28.390676 | 0.061178 / 6.117825 | 0.306208 / 7.494407 | 0.398708 / 8.005208 | 0.365775 / 29.530511 |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-rouge2-beta10-gamma0.3-lr1.0e-6-scale-log | 24.21085 | 1.003987 | 0.660506 / 66.050635 | 0.491601 / 28.075036 | 0.06571 / 6.570997 | 0.303691 / 7.158837 | 0.400042 / 7.805208 | 0.366439 / 29.604388 |
| Jimmy19991222/llama-3-8b-instruct-gapo-v2-rougeL-beta10-gamma0.3-lr1.0e-6-scale-log | 24.056972 | 0.973172 | 0.649191 / 64.919081 | 0.495249 / 28.562567 | 0.064199 / 6.41994 | 0.302013 / 6.935123 | 0.396135 / 7.383594 | 0.371094 / 30.121528 |

One further record (Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32; bfloat16; 🔶; Original; LlamaForCausalLM) is truncated in the source after its Model field, so its remaining values are not available.
Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32
823930851c57b11fd2e25cd65b5c53f909209d0e
23.252877
llama3.1
1
8.03
true
false
false
true
0.707545
0.618541
61.854103
0.517745
30.724097
0.05136
5.135952
0.282718
4.362416
0.436938
13.617187
0.314412
23.823508
true
false
2024-10-23
2024-10-25
0
Joseph717171/Hermes-3-Llama-3.1-8B_TIES_with_Base_Embeds_Initialized_to_Special_Instruct_Toks_dtypeF32
Joseph717171_Llama-3.1-SuperNova-8B-Lite_TIES_with_Base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Joseph717171__Llama-3.1-SuperNova-8B-Lite_TIES_with_Base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
f1e2cad4dca10f948fd2ee9588f80df0b40d7232
30.245028
llama3.1
8
8.03
true
false
false
true
1.749463
0.809633
80.963289
0.514742
31.465813
0.183535
18.353474
0.309564
7.941834
0.41099
10.740365
0.388049
32.005393
true
false
2024-10-02
2024-10-03
0
Joseph717171/Llama-3.1-SuperNova-8B-Lite_TIES_with_Base
Josephgflowers_Cinder-Phi-2-V1-F16-gguf_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/Cinder-Phi-2-V1-F16-gguf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Cinder-Phi-2-V1-F16-gguf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Cinder-Phi-2-V1-F16-gguf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/Cinder-Phi-2-V1-F16-gguf
85629ec9b18efee31d07630664e7a3815121badf
11.258523
mit
4
2.78
true
false
false
true
0.942807
0.235657
23.565695
0.439662
22.453402
0.024169
2.416918
0.281879
4.250559
0.343458
1.965625
0.21609
12.898936
false
false
2024-02-25
2024-06-26
0
Josephgflowers/Cinder-Phi-2-V1-F16-gguf
Josephgflowers_Differential-Attention-Liquid-Metal-Tinyllama_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Differential-Attention-Liquid-Metal-Tinyllama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama
bdb6c63ff1025241e8e10b1858d67dc410f0a702
5.25096
mit
3
1.1
true
false
false
true
0.347587
0.222692
22.269246
0.292556
2.552224
0.032477
3.247734
0.250839
0.111857
0.335552
0.94401
0.121426
2.380689
false
false
2024-11-05
2024-11-07
0
Josephgflowers/Differential-Attention-Liquid-Metal-Tinyllama
Josephgflowers_TinyLlama-Cinder-Agent-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama-Cinder-Agent-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama-Cinder-Agent-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama-Cinder-Agent-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama-Cinder-Agent-v1
a9cd8b48bfe30f29bb1f819213da9a4c41eee67f
6.332677
mit
2
1.1
true
false
false
true
0.475663
0.266956
26.695612
0.311604
3.804167
0.034743
3.47432
0.244128
0
0.339458
2.232292
0.116107
1.789672
false
false
2024-05-21
2024-06-26
4
Josephgflowers/TinyLlama-3T-Cinder-v1.2
Josephgflowers_TinyLlama-v1.1-Cinders-World_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama-v1.1-Cinders-World" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama-v1.1-Cinders-World</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama-v1.1-Cinders-World-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama-v1.1-Cinders-World
11a2c305f787a7908dd87c4e5a7d0f1e314a1f05
5.683003
mit
0
1.1
true
false
false
true
0.514767
0.246923
24.692261
0.299797
3.107714
0.034743
3.47432
0.244128
0
0.335615
0.61849
0.119847
2.20523
false
false
2024-10-12
2024-10-13
0
Josephgflowers/TinyLlama-v1.1-Cinders-World
Josephgflowers_TinyLlama_v1.1_math_code-world-test-1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/TinyLlama_v1.1_math_code-world-test-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/TinyLlama_v1.1_math_code-world-test-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__TinyLlama_v1.1_math_code-world-test-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/TinyLlama_v1.1_math_code-world-test-1
6f7c2aaf0b8723bc6a1dc23a4a1ff0ec24dc11ec
2.002811
mit
0
1.1
true
false
false
false
0.545888
0.007844
0.784363
0.314635
4.164017
0.019637
1.963746
0.23406
0
0.349906
3.638281
0.113198
1.46646
false
false
2024-06-23
2024-09-09
0
Josephgflowers/TinyLlama_v1.1_math_code-world-test-1
Josephgflowers_Tinyllama-STEM-Cinder-Agent-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Tinyllama-STEM-Cinder-Agent-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1
c6880b94e72dddbe591fdf30fa15fe42ea60b924
5.683635
mit
0
1.1
true
false
false
true
0.326771
0.212576
21.257597
0.308438
3.731313
0.067221
6.722054
0.234899
0
0.334125
1.432292
0.108627
0.958555
false
false
2024-11-27
2024-11-27
1
Josephgflowers/Tinyllama-STEM-Cinder-Agent-v1 (Merge)
Josephgflowers_Tinyllama-r1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Josephgflowers/Tinyllama-r1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Josephgflowers/Tinyllama-r1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Josephgflowers__Tinyllama-r1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Josephgflowers/Tinyllama-r1
01af42eb435005a6103760b3f85549ad5e5c35dc
5.217267
llama3.3
2
1.1
true
false
false
true
0.214772
0.211927
21.192658
0.301463
3.204659
0.032477
3.247734
0.256711
0.894855
0.33149
1.269531
0.113447
1.494164
false
false
2025-02-07
2025-02-09
1
Josephgflowers/Tinyllama-r1 (Merge)
JungZoona_T3Q-Qwen2.5-14B-Instruct-1M-e3_bfloat16
bfloat16
❓ other

Original
Unknown
<a target="_blank" href="https://huggingface.co/JungZoona/T3Q-Qwen2.5-14B-Instruct-1M-e3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JungZoona/T3Q-Qwen2.5-14B-Instruct-1M-e3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JungZoona__T3Q-Qwen2.5-14B-Instruct-1M-e3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JungZoona/T3Q-Qwen2.5-14B-Instruct-1M-e3
5362b0e623096cf6667a47cefd2b33e2e3dd37a9
47.091545
apache-2.0
9
0
true
false
false
false
1.397414
0.732397
73.239671
0.758597
65.466597
0.286254
28.625378
0.416946
22.259508
0.591104
38.688021
0.588431
54.270095
false
false
2025-03-09
2
Qwen/Qwen2.5-14B
JungZoona_T3Q-qwen2.5-14b-v1.0-e3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JungZoona/T3Q-qwen2.5-14b-v1.0-e3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JungZoona/T3Q-qwen2.5-14b-v1.0-e3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JungZoona__T3Q-qwen2.5-14b-v1.0-e3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JungZoona/T3Q-qwen2.5-14b-v1.0-e3
76b2538f6a0646f8c507f9c9ab070030d3c1b90c
47.091545
apache-2.0
9
14.77
true
false
false
true
1.561938
0.732397
73.239671
0.758597
65.466597
0.286254
28.625378
0.416946
22.259508
0.591104
38.688021
0.588431
54.270095
false
false
2025-03-09
2025-03-13
2
Qwen/Qwen2.5-14B
Junhoee_Qwen-Megumin_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/Junhoee/Qwen-Megumin" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Junhoee/Qwen-Megumin</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Junhoee__Qwen-Megumin-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Junhoee/Qwen-Megumin
bb46c15ee4bb56c5b63245ef50fd7637234d6f75
33.99218
1
15.231
false
false
false
true
2.840298
0.714112
71.411189
0.528527
33.642144
0.490181
49.018127
0.296141
6.152125
0.398031
8.18724
0.41988
35.542258
false
false
2024-11-26
2024-11-26
2
Qwen/Qwen2.5-7B
KSU-HW-SEC_Llama3-70b-SVA-FT-1415_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-1415" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-1415</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-1415-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-1415
1c09728455567898116d2d9cfb6cbbbbd4ee730c
36.119233
0
70.554
false
false
false
false
19.202058
0.617991
61.799137
0.665015
51.328741
0.219789
21.978852
0.375
16.666667
0.456542
17.801042
0.524269
47.140957
false
false
2024-09-08
0
Removed
KSU-HW-SEC_Llama3-70b-SVA-FT-500_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-500</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-500-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-500
856a23f28aeada23d1135c86a37e05524307e8ed
35.953712
0
70.554
false
false
false
false
18.947476
0.610522
61.05223
0.669224
51.887026
0.213746
21.374622
0.380872
17.449664
0.451146
16.993229
0.522689
46.965499
false
false
2024-09-08
0
Removed
KSU-HW-SEC_Llama3-70b-SVA-FT-final_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3-70b-SVA-FT-final" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3-70b-SVA-FT-final</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3-70b-SVA-FT-final-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3-70b-SVA-FT-final
391bbd94173b34975d1aa2c7356977a630253b75
36.093837
0
70.554
false
false
false
false
19.312398
0.616468
61.646764
0.665015
51.328741
0.219789
21.978852
0.375
16.666667
0.456542
17.801042
0.524269
47.140957
false
false
2024-09-08
0
Removed
KSU-HW-SEC_Llama3.1-70b-SVA-FT-1000step_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KSU-HW-SEC__Llama3.1-70b-SVA-FT-1000step-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KSU-HW-SEC/Llama3.1-70b-SVA-FT-1000step
b195fea0d8f350ff29243d4e88654b1baa5af79e
40.750259
0
70.554
false
false
false
false
25.108894
0.723804
72.380395
0.690312
55.485365
0.320997
32.099698
0.395973
19.463087
0.459177
17.830469
0.525183
47.242538
false
false
2024-09-08
0
Removed
Khetterman_DarkAtom-12B-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Khetterman/DarkAtom-12B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Khetterman/DarkAtom-12B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Khetterman__DarkAtom-12B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Khetterman/DarkAtom-12B-v3
7c7dacc560b64dcff96121ea99374794ccd64b7c
25.880383
15
12.248
false
false
false
true
2.106568
0.617342
61.734199
0.515371
31.659542
0.111027
11.102719
0.297819
6.375839
0.446802
16.116927
0.354638
28.29307
false
false
2024-11-13
2024-12-09
1
Khetterman/DarkAtom-12B-v3 (Merge)
Khetterman_Kosmos-8B-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Khetterman/Kosmos-8B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Khetterman/Kosmos-8B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Khetterman__Kosmos-8B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Khetterman/Kosmos-8B-v1
16ad5242ca89c6901fae1f41033f00ce455f4be0
21.318471
4
8.03
false
false
false
false
1.258897
0.412911
41.291108
0.523352
31.758959
0.098943
9.89426
0.298658
6.487696
0.391885
8.81901
0.366938
29.659796
false
false
2024-11-22
2024-12-27
1
Khetterman/Kosmos-8B-v1 (Merge)
Kimargin_GPT-NEO-1.3B-wiki_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/Kimargin/GPT-NEO-1.3B-wiki" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kimargin/GPT-NEO-1.3B-wiki</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kimargin__GPT-NEO-1.3B-wiki-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kimargin/GPT-NEO-1.3B-wiki
92fa51fa6589f6e8fdfcc83f085216b3dae11da5
5.349183
apache-2.0
1
1.316
true
false
false
false
1.248672
0.192068
19.206816
0.302634
3.423612
0.01435
1.435045
0.244966
0
0.38826
6.932552
0.109874
1.097074
false
false
2024-10-23
2024-10-24
1
Kimargin/GPT-NEO-1.3B-wiki (Merge)
KingNish_Qwen2.5-0.5b-Test-ft_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/Qwen2.5-0.5b-Test-ft" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/Qwen2.5-0.5b-Test-ft</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__Qwen2.5-0.5b-Test-ft-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/Qwen2.5-0.5b-Test-ft
f905bb1d37c7853fb5c7157d8d3ad0f062b65c0f
7.865416
apache-2.0
10
0.494
true
false
false
false
1.337381
0.267081
26.708134
0.323153
6.058845
0.035498
3.549849
0.263423
1.789709
0.342125
1.432292
0.168883
7.653664
false
false
2024-09-26
2024-09-29
1
KingNish/Qwen2.5-0.5b-Test-ft (Merge)
KingNish_Reasoning-0.5b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/Reasoning-0.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/Reasoning-0.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__Reasoning-0.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/Reasoning-0.5b
fca9019dec693bfcb8a1fbc39e301636ae2c518d
7.163893
apache-2.0
30
0.494
true
false
false
true
0.511241
0.217422
21.7422
0.335363
7.491211
0.021903
2.190332
0.267617
2.348993
0.351333
2.083333
0.164146
7.12729
false
false
2024-10-05
2025-03-07
2
Qwen/Qwen2.5-0.5B
KingNish_Reasoning-Llama-3b-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/Reasoning-Llama-3b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/Reasoning-Llama-3b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__Reasoning-Llama-3b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/Reasoning-Llama-3b-v0.1
d164caf591c42a4cbc3b21d46493e72fbdbd9de8
20.21238
llama3.2
9
3.213
true
false
false
true
1.35047
0.622463
62.246284
0.434336
19.862451
0.129909
12.990937
0.259228
1.230425
0.31676
2.395052
0.302942
22.549128
false
false
2024-10-10
2024-10-26
1
meta-llama/Llama-3.2-3B-Instruct
KingNish_qwen-1b-continued_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/qwen-1b-continued" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/qwen-1b-continued</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__qwen-1b-continued-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/qwen-1b-continued
4abdfa59671b7c23c535aca87ce15baef8ed1125
4.7926
apache-2.0
0
1.277
true
false
false
false
0.891073
0.125473
12.547263
0.299095
4.387464
0.009063
0.906344
0.267617
2.348993
0.385875
5.667708
0.12608
2.897828
false
false
2025-03-07
2025-03-07
1
Removed
KingNish_qwen-1b-continued-v2_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/qwen-1b-continued-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/qwen-1b-continued-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__qwen-1b-continued-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/qwen-1b-continued-v2
bfa7ea0a2675dc4acf890d6ca5e3c218315e017c
4.441642
apache-2.0
0
1.277
true
false
false
false
0.911487
0.157871
15.787112
0.311949
4.989232
0.010574
1.057402
0.25
0
0.339271
2.675521
0.119265
2.140588
false
false
2025-03-07
2025-03-07
1
Removed
KingNish_qwen-1b-continued-v2.1_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/qwen-1b-continued-v2.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/qwen-1b-continued-v2.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__qwen-1b-continued-v2.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/qwen-1b-continued-v2.1
b1f1624062c683f84815bdeead038c0b2cf2c884
5.461815
apache-2.0
0
1.277
true
false
false
false
0.890406
0.112683
11.268324
0.304166
4.197658
0.009063
0.906344
0.267617
2.348993
0.415396
10.957813
0.127826
3.091755
false
false
2025-03-08
2025-03-08
2
Removed
KingNish_qwen-1b-continued-v2.2_float16
float16
🟩 continuously pretrained
🟩
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/KingNish/qwen-1b-continued-v2.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">KingNish/qwen-1b-continued-v2.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/KingNish__qwen-1b-continued-v2.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
KingNish/qwen-1b-continued-v2.2
8afa5b9fb92d599500e9043f4219de945b890839
4.441642
apache-2.0
0
1.277
true
false
false
false
0.903814
0.14126
14.125964
0.305866
4.956069
0.015106
1.510574
0.256711
0.894855
0.351302
2.246094
0.126247
2.916297
false
false
2025-03-08
2025-03-09
3
Removed
Kquant03_CognitiveFusion2-4x7B-BF16_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Kquant03/CognitiveFusion2-4x7B-BF16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kquant03/CognitiveFusion2-4x7B-BF16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kquant03__CognitiveFusion2-4x7B-BF16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kquant03/CognitiveFusion2-4x7B-BF16
db45b86c462bb93db7ba4f2c3fe3517582c859a1
15.629055
apache-2.0
3
24.154
true
true
false
true
3.332071
0.356657
35.6657
0.410783
17.689003
0.057402
5.740181
0.286074
4.809843
0.414552
9.952344
0.279255
19.917258
true
false
2024-04-06
2024-07-31
0
Kquant03/CognitiveFusion2-4x7B-BF16
Kquant03_L3-Pneuma-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Kquant03/L3-Pneuma-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kquant03/L3-Pneuma-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kquant03__L3-Pneuma-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kquant03/L3-Pneuma-8B
257aa8d00e82f91b7a780384aa76573c2ea614a8
16.61757
llama3
1
8.03
true
false
false
false
1.607622
0.237406
23.740564
0.495504
28.820202
0.050604
5.060423
0.307047
7.606264
0.417156
10.211198
0.318401
24.26677
false
false
2024-10-13
2024-10-16
1
meta-llama/Meta-Llama-3-8B
Krystalan_DRT-o1-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Krystalan/DRT-o1-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Krystalan/DRT-o1-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Krystalan__DRT-o1-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Krystalan/DRT-o1-14B
b89415f3ceb805687dc75d0ac82d1425e497bcaa
36.166086
cc-by-nc-sa-4.0
22
14.77
true
false
false
false
3.745981
0.406766
40.676627
0.637928
48.141822
0.482628
48.26284
0.352349
13.646532
0.47951
19.838802
0.517869
46.429891
false
false
2024-12-23
2024-12-27
1
Krystalan/DRT-o1-14B (Merge)
Krystalan_DRT-o1-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Krystalan/DRT-o1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Krystalan/DRT-o1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Krystalan__DRT-o1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Krystalan/DRT-o1-7B
5e1848ded3209d414113c1ded1389cd04aef99c2
31.402662
cc-by-nc-sa-4.0
13
7.616
true
false
false
false
1.321964
0.392828
39.28277
0.546769
35.738937
0.447885
44.78852
0.321309
9.50783
0.508656
24.082031
0.415143
35.015884
false
false
2024-12-23
2025-02-05
1
Krystalan/DRT-o1-7B (Merge)
Kukedlc_NeuralExperiment-7b-MagicCoder-v7.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralExperiment-7b-MagicCoder-v7.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
43ea8d27d652dc15e4d27f665c5d636a5937780b
18.006005
apache-2.0
6
7.242
true
false
false
true
0.90972
0.455251
45.525096
0.398845
16.386034
0.066465
6.646526
0.296141
6.152125
0.428198
13.058073
0.282414
20.268174
false
false
2024-03-07
2024-07-30
0
Kukedlc/NeuralExperiment-7b-MagicCoder-v7.5
Kukedlc_NeuralLLaMa-3-8b-DT-v0.1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralLLaMa-3-8b-DT-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralLLaMa-3-8b-DT-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralLLaMa-3-8b-DT-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralLLaMa-3-8b-DT-v0.1
1fe849c1e7e4793c2fdd869fcfb51e0d1910674f
21.259599
other
1
8.03
true
false
false
false
1.705928
0.437141
43.714123
0.498677
28.008308
0.080816
8.081571
0.302852
7.04698
0.407115
9.689323
0.379156
31.017287
true
false
2024-05-11
2024-09-17
1
Kukedlc/NeuralLLaMa-3-8b-DT-v0.1 (Merge)
Kukedlc_NeuralLLaMa-3-8b-ORPO-v0.3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralLLaMa-3-8b-ORPO-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3
aa176c0db7791a1c09039135791145b0704a5f46
17.748742
apache-2.0
2
8.03
true
false
false
true
1.83102
0.527591
52.759124
0.455714
22.391712
0.048338
4.833837
0.239094
0
0.370031
3.653906
0.305685
22.853871
false
false
2024-05-14
2024-07-28
1
Kukedlc/NeuralLLaMa-3-8b-ORPO-v0.3 (Merge)
Kukedlc_NeuralSynthesis-7B-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralSynthesis-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralSynthesis-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralSynthesis-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralSynthesis-7B-v0.1
547a5dc8963e127a9638256bb80eb3a36da1cc5d
20.015677
apache-2.0
3
7.242
true
false
false
false
1.192598
0.418456
41.845636
0.514475
31.834395
0.063444
6.344411
0.28104
4.138702
0.433281
13.160156
0.304937
22.770759
true
false
2024-04-06
2024-06-29
0
Kukedlc/NeuralSynthesis-7B-v0.1
Kukedlc_NeuralSynthesis-7B-v0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralSynthesis-7B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralSynthesis-7B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralSynthesis-7B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralSynthesis-7B-v0.3
090fab29146f8e55066bce2f5f5859ab2d6027f4
20.095273
apache-2.0
0
7.242
true
false
false
false
1.167144
0.40784
40.784009
0.513808
31.811748
0.077795
7.779456
0.280201
4.026846
0.434583
13.389583
0.30502
22.779994
true
false
2024-04-07
2024-07-31
0
Kukedlc/NeuralSynthesis-7B-v0.3
Kukedlc_NeuralSynthesis-7b-v0.4-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/NeuralSynthesis-7b-v0.4-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/NeuralSynthesis-7b-v0.4-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__NeuralSynthesis-7b-v0.4-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/NeuralSynthesis-7b-v0.4-slerp
bb3bd36fce162f472668dbd91960cd1525b45f30
19.530513
apache-2.0
0
7.242
true
false
false
false
1.198129
0.394726
39.472599
0.514293
31.997187
0.062689
6.268882
0.277685
3.691275
0.43325
13.05625
0.304272
22.696882
true
false
2024-04-12
2024-07-31
1
Kukedlc/NeuralSynthesis-7b-v0.4-slerp (Merge)
Kukedlc_Qwen-2.5-7b-Spanish-o1-CoT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Kukedlc/Qwen-2.5-7b-Spanish-o1-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kukedlc/Qwen-2.5-7b-Spanish-o1-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kukedlc__Qwen-2.5-7b-Spanish-o1-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kukedlc/Qwen-2.5-7b-Spanish-o1-CoT
71d8ed0634e20921a761a0e349dca9eb86b63f82
28.573639
apache-2.0
2
7.616
true
false
false
false
1.273072
0.42103
42.102953
0.560195
36.863365
0.272659
27.265861
0.32047
9.395973
0.477677
18.442969
0.436336
37.370715
false
false
2024-12-03
2024-12-04
0
Kukedlc/Qwen-2.5-7b-Spanish-o1-CoT
Kumar955_Hemanth-llm_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Kumar955/Hemanth-llm" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Kumar955/Hemanth-llm</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Kumar955__Hemanth-llm-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Kumar955/Hemanth-llm
871325cc04f57cd953c161a0ace49c47af8eca4c
22.143018
0
7.242
false
false
false
false
2.672247
0.50451
50.451026
0.522495
33.044262
0.070242
7.024169
0.282718
4.362416
0.448563
14.503646
0.311253
23.472592
false
false
2024-09-24
2024-09-24
1
Kumar955/Hemanth-llm (Merge)
L-RAGE_3_PRYMMAL-ECE-7B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/L-RAGE__3_PRYMMAL-ECE-7B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1
483902db68f99affe1d7f1139755dfd115abbca5
14.854317
apache-2.0
0
1.777
true
false
false
false
1.179557
0.274226
27.422572
0.422794
19.083009
0.108006
10.800604
0.281879
4.250559
0.384135
6.183594
0.29247
21.385564
true
false
2024-10-29
2024-10-29
1
L-RAGE/3_PRYMMAL-ECE-7B-SLERP-V1 (Merge)
LEESM_llama-2-7b-hf-lora-oki100p_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-2-7b-hf-lora-oki100p" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-2-7b-hf-lora-oki100p</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-2-7b-hf-lora-oki100p-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-2-7b-hf-lora-oki100p
4bfd99888bf37e23d966f1e537fe199992c27a72
8.782859
mit
2
6.738
true
false
false
false
0.967635
0.251294
25.129434
0.349168
10.265743
0.016616
1.661631
0.269295
2.572707
0.368729
3.557813
0.185588
9.509826
false
false
2024-07-17
2024-11-08
0
LEESM/llama-2-7b-hf-lora-oki100p
LEESM_llama-2-7b-hf-lora-oki10p_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-2-7b-hf-lora-oki10p" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-2-7b-hf-lora-oki10p</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-2-7b-hf-lora-oki10p-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-2-7b-hf-lora-oki10p
d6e5af01616a038ac2b5cb83f458e490e1102244
7.168376
mit
0
6.738
true
false
false
false
1.47553
0.227014
22.701432
0.353093
9.438287
0.016616
1.661631
0.254195
0.559284
0.347521
1.106771
0.167886
7.542849
false
false
2024-04-03
2024-11-08
0
LEESM/llama-2-7b-hf-lora-oki10p
LEESM_llama-3-8b-bnb-4b-kowiki231101_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-3-8b-bnb-4b-kowiki231101" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-3-8b-bnb-4b-kowiki231101</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-3-8b-bnb-4b-kowiki231101-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-3-8b-bnb-4b-kowiki231101
63b8f715daab6a0c7196a20855be8e85fe7ddcb4
9.472498
apache-2.0
0
8.03
true
false
false
false
1.513775
0.168487
16.848739
0.413081
16.934868
0.013595
1.359517
0.270973
2.796421
0.355146
3.059896
0.24252
15.83555
false
false
2024-11-08
2024-11-08
2
meta-llama/Meta-Llama-3.1-8B
LEESM_llama-3-Korean-Bllossom-8B-trexlab-oki10p_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LEESM__llama-3-Korean-Bllossom-8B-trexlab-oki10p-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p
d105e0365510f9e5f8550558343083cab8523524
13.509663
mit
0
8.03
true
false
false
false
1.516716
0.213725
21.372514
0.434301
19.797436
0.046828
4.682779
0.275168
3.355705
0.386927
7.665885
0.317653
24.183658
false
false
2024-07-22
2024-11-08
0
LEESM/llama-3-Korean-Bllossom-8B-trexlab-oki10p
LGAI-EXAONE_EXAONE-3.0-7.8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ExaoneForCausalLM
<a target="_blank" href="https://huggingface.co/LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LGAI-EXAONE__EXAONE-3.0-7.8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct
7f15baedd46858153d817445aff032f4d6cf4939
25.733776
other
407
7.8
true
false
false
true
1.650256
0.719283
71.928261
0.417443
17.977335
0.304381
30.438066
0.26594
2.12528
0.366125
3.298958
0.357713
28.634752
false
false
2024-07-31
2024-08-18
0
LGAI-EXAONE/EXAONE-3.0-7.8B-Instruct
LGAI-EXAONE_EXAONE-3.5-2.4B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ExaoneForCausalLM
<a target="_blank" href="https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LGAI-EXAONE__EXAONE-3.5-2.4B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct
e949c91dec92095908d34e6b560af77dd0c993f8
27.143883
other
143
2.405
true
false
false
true
1.214543
0.795045
79.504493
0.409235
15.947437
0.367825
36.782477
0.26594
2.12528
0.366125
3.165625
0.328042
25.337988
false
false
2024-12-01
2024-12-11
0
LGAI-EXAONE/EXAONE-3.5-2.4B-Instruct
LGAI-EXAONE_EXAONE-3.5-32B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ExaoneForCausalLM
<a target="_blank" href="https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-32B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LGAI-EXAONE/EXAONE-3.5-32B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LGAI-EXAONE__EXAONE-3.5-32B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LGAI-EXAONE/EXAONE-3.5-32B-Instruct
d6fa88cd8d2c9512b40578bdc44e64909e5a5042
37.603166
other
113
32.003
true
false
false
true
30.995242
0.839183
83.918337
0.576091
39.824203
0.51284
51.283988
0.287752
5.033557
0.380667
5.15
0.46368
40.40891
false
false
2024-12-01
2025-01-13
0
LGAI-EXAONE/EXAONE-3.5-32B-Instruct
LGAI-EXAONE_EXAONE-3.5-7.8B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ExaoneForCausalLM
<a target="_blank" href="https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LGAI-EXAONE__EXAONE-3.5-7.8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct
0ff6b5ec7c13b049b253a16a889aa269e6b79a94
32.54723
other
112
7.818
true
false
false
true
1.439943
0.813605
81.360457
0.472759
25.653749
0.475076
47.507553
0.25755
1.006711
0.377938
4.942188
0.413314
34.812722
false
false
2024-12-01
2024-12-11
0
LGAI-EXAONE/EXAONE-3.5-7.8B-Instruct
LLM360_K2_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LLM360/K2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLM360/K2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LLM360__K2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LLM360/K2
49d159b6f2b64d562e745f0ff06e65b9a4c28ead
14.643753
apache-2.0
87
65.286
true
false
false
false
17.676413
0.225216
22.521576
0.497184
28.220403
0.02719
2.719033
0.276846
3.579418
0.398
8.55
0.300449
22.272089
false
true
2024-04-17
2024-06-26
0
LLM360/K2
LLM360_K2-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LLM360/K2-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLM360/K2-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LLM360__K2-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LLM360/K2-Chat
5454f2d28031c9127e4227c873ca2f154e02e4c7
24.387145
apache-2.0
36
65.286
true
false
false
true
34.519656
0.515176
51.51764
0.53581
33.793829
0.103474
10.347432
0.306208
7.494407
0.457
16.825
0.337101
26.344563
false
true
2024-05-22
2024-06-12
0
LLM360/K2-Chat
LLM4Binary_llm4decompile-1.3b-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/LLM4Binary/llm4decompile-1.3b-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LLM4Binary/llm4decompile-1.3b-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LLM4Binary__llm4decompile-1.3b-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LLM4Binary/llm4decompile-1.3b-v2
a347dabcb1ea9f21c9339bd764c150262e993b95
6.939025
mit
8
1.346
true
false
false
false
0.495165
0.226789
22.678936
0.327181
5.915475
0.01284
1.283988
0.235738
0
0.407177
9.430469
0.120928
2.325281
false
false
2024-06-18
2024-11-16
0
LLM4Binary/llm4decompile-1.3b-v2
Lambent_qwen2.5-reinstruct-alternate-lumen-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lambent/qwen2.5-reinstruct-alternate-lumen-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lambent/qwen2.5-reinstruct-alternate-lumen-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lambent__qwen2.5-reinstruct-alternate-lumen-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lambent/qwen2.5-reinstruct-alternate-lumen-14B
dac3be334098338fb6c02636349e8ed53f18c4a4
38.070618
3
14.766
false
false
false
false
4.529301
0.479381
47.938137
0.645899
48.989609
0.462236
46.223565
0.376678
16.89038
0.477
19.625
0.538813
48.757018
false
false
2024-09-23
2024-09-28
1
Lambent/qwen2.5-reinstruct-alternate-lumen-14B (Merge)
Langboat_Mengzi3-8B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Langboat/Mengzi3-8B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Langboat/Mengzi3-8B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Langboat__Mengzi3-8B-Chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Langboat/Mengzi3-8B-Chat
128fffd3dac7c6067ca4d1a650e836e3ef46c013
20.288293
apache-2.0
1
8.03
true
false
false
true
1.703871
0.513977
51.397736
0.468373
25.188298
0.090634
9.063444
0.274329
3.243848
0.407792
9.040625
0.314162
23.795804
false
false
2024-09-14
2024-10-21
0
Langboat/Mengzi3-8B-Chat
Lawnakk_BBA100_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBA100" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBA100</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBA100-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBA100
1f67fe78975a4f053e61106fa448055976151144
5.585008
0
7.616
false
false
false
false
0.725994
0.20758
20.758033
0.28257
2.168404
0.009819
0.981873
0.244128
0
0.401969
8.246094
0.112201
1.355644
false
false
2025-02-26
2025-02-26
1
Lawnakk/BBA100 (Merge)
Lawnakk_BBALAW1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1
f8c558ee7cf7033e738c71ae73cc8764bc9ec944
5.709611
0
7.616
false
false
false
false
0.67012
0.190544
19.054442
0.287237
2.532751
0.009819
0.981873
0.243289
0
0.415271
10.342188
0.112118
1.34641
false
false
2025-02-27
2025-02-27
1
Lawnakk/BBALAW1 (Merge)
Lawnakk_BBALAW1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.0
e10c170571afb38d21b16ff4790eb10218f0fc07
3.25567
0
4.353
false
false
false
false
0.403299
0.135115
13.511483
0.282767
1.247638
0
0
0.255872
0.782998
0.352573
2.571615
0.112783
1.420287
false
false
2025-02-27
2025-02-27
1
Lawnakk/BBALAW1.0 (Merge)
Lawnakk_BBALAW1.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.2
19994613ecdfb585d645fb7f12c481ffda6c2968
3.41748
0
4.353
false
false
false
false
0.386482
0.13544
13.543952
0.281127
1.316795
0
0
0.264262
1.901566
0.357906
2.571615
0.110539
1.170952
false
false
2025-02-27
2025-02-27
1
Lawnakk/BBALAW1.2 (Merge)
Lawnakk_BBALAW1.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.3
b9f66617f14946b773f4a55947bfc7dc2c1c184f
3.350236
0
4.353
false
false
false
false
0.373649
0.13544
13.543952
0.282698
1.223377
0
0
0.260906
1.454139
0.361906
2.838281
0.109375
1.041667
false
false
2025-02-27
2025-02-27
1
Lawnakk/BBALAW1.3 (Merge)
Lawnakk_BBALAW1.6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.6
d0a9e2e9caffb1f7b8d2c7f7bd3236cb9cc0afd9
31.048652
0
7.616
false
false
false
false
0.641741
0.524544
52.454377
0.555356
36.426502
0.360272
36.02719
0.323826
9.8434
0.436844
12.572135
0.450715
38.968307
false
false
2025-02-27
2025-02-27
1
Lawnakk/BBALAW1.6 (Merge)
Lawnakk_BBALAW1.61_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.61" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.61</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.61-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.61
5d868edc32bba0c4079156f2ecc5c0a511a3795d
31.79392
0
7.616
false
false
false
false
0.659426
0.577125
57.712536
0.554858
36.403567
0.366314
36.63142
0.317114
8.948546
0.43551
12.505469
0.447058
38.561983
false
false
2025-02-28
2025-02-28
1
Lawnakk/BBALAW1.61 (Merge)
Lawnakk_BBALAW1.62_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.62" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.62</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.62-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.62
9822fa5f241e48ed681d6ad87abac296c73fc28e
29.539931
0
7.616
false
false
false
false
0.613554
0.50461
50.460999
0.558052
37.104538
0.282477
28.247734
0.319631
9.284116
0.434333
12.758333
0.454455
39.383865
false
false
2025-02-28
2025-02-28
1
Lawnakk/BBALAW1.62 (Merge)
Lawnakk_BBALAW1.63_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.63" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.63</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.63-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.63
5e60f6aa903178906c57cacd0149db10bbda945f
29.373589
0
7.613
false
false
false
false
0.698909
0.440738
44.073835
0.554063
36.360915
0.370091
37.009063
0.312081
8.277405
0.430333
11.958333
0.447058
38.561983
false
false
2025-02-28
2025-02-28
1
Lawnakk/BBALAW1.63 (Merge)
Lawnakk_BBALAW1.64_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Lawnakk/BBALAW1.64" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Lawnakk/BBALAW1.64</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Lawnakk__BBALAW1.64-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Lawnakk/BBALAW1.64
7bd7f808a66dee4360f7c6a02ece7dc72127c1fc
3.181118
0
7.616
false
false
false
false
0.654014
0.139461
13.946107
0.277907
1.775501
0
0
0.248322
0
0.344667
2.083333
0.111536
1.281767
false
false
2025-02-28
2025-02-28
1
Lawnakk/BBALAW1.64 (Merge)
LenguajeNaturalAI_leniachat-gemma-2b-v0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/LenguajeNaturalAI/leniachat-gemma-2b-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LenguajeNaturalAI/leniachat-gemma-2b-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LenguajeNaturalAI__leniachat-gemma-2b-v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LenguajeNaturalAI/leniachat-gemma-2b-v0
e5691dcc682a10dc9ef4bdbb3dc896fcf271018e
5.737241
apache-2.0
14
2.506
true
false
false
true
1.933155
0.214974
21.497405
0.307402
4.138297
0.011329
1.132931
0.26594
2.12528
0.365906
3.638281
0.117021
1.891253
false
false
2024-04-09
2024-09-01
1
google/gemma-2b
LenguajeNaturalAI_leniachat-qwen2-1.5B-v0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/LenguajeNaturalAI/leniachat-qwen2-1.5B-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LenguajeNaturalAI/leniachat-qwen2-1.5B-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LenguajeNaturalAI__leniachat-qwen2-1.5B-v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LenguajeNaturalAI/leniachat-qwen2-1.5B-v0
031a2efebb3cc1150e46f42ba0bea9fa7b855436
8.580803
apache-2.0
19
1.543
true
false
false
true
1.689124
0.222118
22.211842
0.368356
12.771666
0.01284
1.283988
0.261745
1.565996
0.37499
3.873698
0.187999
9.77763
false
false
2024-06-16
2024-09-30
1
Qwen/Qwen2-1.5B
LeroyDyer_CheckPoint_A_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/LeroyDyer/CheckPoint_A" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">LeroyDyer/CheckPoint_A</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/LeroyDyer__CheckPoint_A-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
LeroyDyer/CheckPoint_A
508f1ec7ee33134a8c1de7774c08c1ce091466ed
18.929189
0
7.242
false
false
false
true
0.919346
0.451279
45.127927
0.47477
25.809387
0.058912
5.891239
0.283557
4.474273
0.423083
11.385417
0.287982
20.886894
false
false
2025-02-05
0
Removed