Schema (one value per line in the records below, in this column order):

eval_name: string, length 12 to 111
Precision: string, 3 classes
Type: string, 7 classes
T: string, 7 classes
Weight type: string, 2 classes
Architecture: string, 64 classes
Model: string, length 355 to 689
fullname: string, length 4 to 102
Model sha: string, length 0 to 40
Average ⬆️: float64, 0.74 to 52.1
Hub License: string, 27 classes
Hub ❤️: int64, 0 to 6.09k
#Params (B): float64, -1 to 141
Available on the hub: bool, 2 classes
MoE: bool, 2 classes
Flagged: bool, 2 classes
Chat Template: bool, 2 classes
CO₂ cost (kg): float64, 0.04 to 187
IFEval Raw: float64, 0 to 0.9
IFEval: float64, 0 to 90
BBH Raw: float64, 0.22 to 0.83
BBH: float64, 0.25 to 76.7
MATH Lvl 5 Raw: float64, 0 to 0.71
MATH Lvl 5: float64, 0 to 71.5
GPQA Raw: float64, 0.21 to 0.47
GPQA: float64, 0 to 29.4
MUSR Raw: float64, 0.29 to 0.6
MUSR: float64, 0 to 38.7
MMLU-PRO Raw: float64, 0.1 to 0.73
MMLU-PRO: float64, 0 to 70
Merged: bool, 2 classes
Official Providers: bool, 2 classes
Upload To Hub Date: string, 525 classes
Submission Date: string, 263 classes
Generation: int64, 0 to 10
Base Model: string, length 4 to 102
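For downstream processing, each record in this dump can be mapped back into a typed object. A minimal sketch over a subset of the columns (the class name and field names are illustrative choices, not part of the dataset; values are copied from the first record below):

```python
from dataclasses import dataclass

# Illustrative subset of the leaderboard schema; the full table has
# ~36 columns, including a Raw and a normalized score per benchmark.
@dataclass
class LeaderboardRow:
    eval_name: str     # e.g. "<org>_<model>_<precision>"
    precision: str     # e.g. "bfloat16" or "float16"
    architecture: str  # e.g. "Qwen2ForCausalLM"
    average: float     # "Average ⬆️" column
    params_b: float    # "#Params (B)"; the schema lists -1 as the minimum
    merged: bool       # "Merged" column

# Values taken from the first record in the dump.
row = LeaderboardRow(
    eval_name="Xiaojian9992024_Qwen2.5-Dyanka-7B-Preview-v0.2_bfloat16",
    precision="bfloat16",
    architecture="Qwen2ForCausalLM",
    average=34.554535,
    params_b=7.616,
    merged=False,
)
print(row.precision, row.average)  # bfloat16 34.554535
```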
Xiaojian9992024_Qwen2.5-Dyanka-7B-Preview-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Qwen2.5-Dyanka-7B-Preview-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview-v0.2
5c38789333a433ee69fcbdd4a24a597ba0d0abb8
34.554535
0
7.616
false
false
false
true
0.652747
0.670198
67.019841
0.537439
34.359675
0.472054
47.205438
0.293624
5.816555
0.446708
15.471875
0.437084
37.453827
false
false
2025-02-26
2025-02-26
1
Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview-v0.2 (Merge)
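For the record above, the "Average ⬆️" value matches the unweighted mean of its six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal check, with the values copied from that record:

```python
# Six normalized benchmark scores from the record above.
scores = {
    "IFEval": 67.019841,
    "BBH": 34.359675,
    "MATH Lvl 5": 47.205438,
    "GPQA": 5.816555,
    "MUSR": 15.471875,
    "MMLU-PRO": 37.453827,
}

# Unweighted mean reproduces the reported "Average ⬆️" of 34.554535.
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 34.554535
```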
Xiaojian9992024_Qwen2.5-THREADRIPPER-Medium-Censored_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Qwen2.5-THREADRIPPER-Medium-Censored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Qwen2.5-THREADRIPPER-Medium-Censored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Qwen2.5-THREADRIPPER-Medium-Censored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Qwen2.5-THREADRIPPER-Medium-Censored
269646a7eaedd39fe99a70d222186d901c3995e8
41.543818
1
14.77
false
false
false
true
1.59307
0.811206
81.120649
0.643145
49.112329
0.533988
53.398792
0.334732
11.297539
0.414
10.683333
0.492852
43.650266
false
false
2025-02-07
2025-02-07
1
Xiaojian9992024/Qwen2.5-THREADRIPPER-Medium-Censored (Merge)
Xiaojian9992024_Qwen2.5-THREADRIPPER-Small_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Qwen2.5-THREADRIPPER-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Qwen2.5-THREADRIPPER-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Qwen2.5-THREADRIPPER-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Qwen2.5-THREADRIPPER-Small
987d437ec9bc9c7d12474cbac3663615d8f7dd79
36.554318
4
7.616
false
false
false
true
1.291255
0.768916
76.891647
0.548979
35.794684
0.473565
47.356495
0.310403
8.053691
0.434927
13.932552
0.435672
37.296838
false
false
2025-02-05
2025-02-05
1
Xiaojian9992024/Qwen2.5-THREADRIPPER-Small (Merge)
Xiaojian9992024_Qwen2.5-THREADRIPPER-Small-AnniversaryEdition_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Qwen2.5-THREADRIPPER-Small-AnniversaryEdition" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Qwen2.5-THREADRIPPER-Small-AnniversaryEdition</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Qwen2.5-THREADRIPPER-Small-AnniversaryEdition-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Qwen2.5-THREADRIPPER-Small-AnniversaryEdition
fc115c61ba4c92a64a1c79f5a55e86a464400122
34.250688
2
7.616
false
false
false
true
2.044542
0.74039
74.038994
0.546544
35.291929
0.507553
50.755287
0.268456
2.46085
0.380698
5.253906
0.439328
37.703162
false
false
2025-02-17
2025-02-19
1
Xiaojian9992024/Qwen2.5-THREADRIPPER-Small-AnniversaryEdition (Merge)
Xiaojian9992024_Qwen2.5-Ultra-1.5B-25.02-Exp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Qwen2.5-Ultra-1.5B-25.02-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Qwen2.5-Ultra-1.5B-25.02-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Qwen2.5-Ultra-1.5B-25.02-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Qwen2.5-Ultra-1.5B-25.02-Exp
43b9236cd7578164a4767a9cfbf1d301c9b7240d
14.44657
2
1.544
false
false
false
true
0.600032
0.40734
40.73403
0.406558
17.02638
0.083082
8.308157
0.258389
1.118568
0.338313
1.255729
0.264129
18.236554
false
false
2025-02-20
2025-02-20
1
Xiaojian9992024/Qwen2.5-Ultra-1.5B-25.02-Exp (Merge)
Xiaojian9992024_Reflection-L3.2-JametMiniMix-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Reflection-L3.2-JametMiniMix-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Reflection-L3.2-JametMiniMix-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Reflection-L3.2-JametMiniMix-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Reflection-L3.2-JametMiniMix-3B
4d5f20dd1860f7e8538f51fc25f1854ab9c6a6fd
19.125923
1
3.213
false
false
false
false
0.585162
0.461945
46.194542
0.438953
20.235871
0.119335
11.933535
0.294463
5.928412
0.36674
8.375781
0.298787
22.087397
false
false
2025-02-17
2025-02-17
1
Xiaojian9992024/Reflection-L3.2-JametMiniMix-3B (Merge)
Xkev_Llama-3.2V-11B-cot_float16
float16
🌸 multimodal
🌸
Original
MllamaForConditionalGeneration
<a target="_blank" href="https://huggingface.co/Xkev/Llama-3.2V-11B-cot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xkev/Llama-3.2V-11B-cot</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xkev__Llama-3.2V-11B-cot-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xkev/Llama-3.2V-11B-cot
86d718ed524bf79320497bc2029e835af3b9bcc4
21.759029
apache-2.0
147
10.67
true
false
false
false
1.423221
0.415809
41.580894
0.495872
28.246762
0.155589
15.558912
0.295302
6.040268
0.415854
10.381771
0.35871
28.745567
false
false
2024-11-19
2024-12-26
1
Xkev/Llama-3.2V-11B-cot (Merge)
YOYO-AI_Qwen2.5-14B-1M-YOYO-V3_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-1M-YOYO-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-1M-YOYO-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-1M-YOYO-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-1M-YOYO-V3
bb54e01a0af3947328ce69470c4f0d73e9cb1ac0
42.559427
apache-2.0
4
14.766
true
false
false
true
1.780763
0.839833
83.983275
0.644849
49.46607
0.535498
53.549849
0.328859
10.514541
0.414125
11.098958
0.520695
46.743868
true
false
2025-02-21
2025-02-22
1
YOYO-AI/Qwen2.5-14B-1M-YOYO-V3 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-0505_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-0505" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-0505</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-0505-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-0505
962e4b7f8d2996f5be867a15f0bbe61cff0d2d98
39.665306
0
14.77
false
false
false
false
3.857234
0.588291
58.829129
0.653924
50.359073
0.443353
44.335347
0.373322
16.442953
0.475698
19.46224
0.537068
48.563091
false
false
2025-01-27
0
Removed
YOYO-AI_Qwen2.5-14B-YOYO-0510-v2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-0510-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-0510-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-0510-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-0510-v2
127a1434acced2bc83614eaab86d6ca55c3fa30b
39.980946
apache-2.0
0
14.77
true
false
false
false
4.639096
0.594711
59.471092
0.655283
50.468801
0.444109
44.410876
0.381711
17.561521
0.474396
19.299479
0.538065
48.673907
true
false
2025-01-29
2025-01-29
1
YOYO-AI/Qwen2.5-14B-YOYO-0510-v2 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-0805_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-0805" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-0805</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-0805-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-0805
813e280b8acf5b96bdb02e4da6794d8351df3c02
39.665306
0
14.77
false
false
false
false
3.983397
0.588291
58.829129
0.653924
50.359073
0.443353
44.335347
0.373322
16.442953
0.475698
19.46224
0.537068
48.563091
false
false
2025-01-27
0
Removed
YOYO-AI_Qwen2.5-14B-YOYO-1005_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-1005" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-1005</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-1005-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-1005
9d644a3a31eb23e31a47b690ca13ee52fb4bad80
40.085904
apache-2.0
0
14.77
true
false
false
false
3.930809
0.597159
59.715887
0.654206
50.286899
0.452417
45.241692
0.380872
17.449664
0.473031
19.128906
0.538231
48.692376
true
false
2025-01-25
2025-01-26
1
YOYO-AI/Qwen2.5-14B-YOYO-1005 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-1005-v2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-1005-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-1005-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-1005-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-1005-v2
05fcb89cd0d02c1a913478ead4df45a7ebdd7225
39.991724
apache-2.0
0
14.77
true
false
false
false
3.90949
0.59531
59.531044
0.655132
50.515055
0.443353
44.335347
0.384228
17.897092
0.473063
19.099479
0.537151
48.572326
true
false
2025-01-29
2025-01-29
1
YOYO-AI/Qwen2.5-14B-YOYO-1005-v2 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-1010_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-1010" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-1010</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-1010-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-1010
2d6d7fc307e2b5a570d45393a1511cbdb64e35e0
40.008009
apache-2.0
0
14.77
true
false
false
false
3.88634
0.589865
58.986489
0.653997
50.267719
0.450906
45.090634
0.383389
17.785235
0.474396
19.299479
0.537566
48.618499
true
false
2025-01-25
2025-01-25
1
YOYO-AI/Qwen2.5-14B-YOYO-1010 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-1010_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-1010" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-1010</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-1010-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-1010
b4a8e3712d7dbdfaaa473022ac4d831e41882ad6
31.959649
apache-2.0
0
14.77
true
false
false
true
1.670592
0.790474
79.047372
0.640599
48.690294
0
0
0.316275
8.836689
0.418063
11.357813
0.494432
43.825724
true
false
2025-01-25
2025-01-25
1
YOYO-AI/Qwen2.5-14B-YOYO-1010 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-1010-v2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-1010-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-1010-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-1010-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-1010-v2
04dc9d0b3cea95118eac713d90112fe9b1f5c53f
39.980946
apache-2.0
0
14.77
true
false
false
false
3.919508
0.594711
59.471092
0.655283
50.468801
0.444109
44.410876
0.381711
17.561521
0.474396
19.299479
0.538065
48.673907
true
false
2025-01-28
2025-01-29
1
YOYO-AI/Qwen2.5-14B-YOYO-1010-v2 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-SCE_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-SCE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-SCE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-SCE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-SCE
9c4749ee44179eca24a86dcc5fda790f72c13455
39.669217
apache-2.0
0
14.77
true
false
false
false
3.927547
0.584369
58.436947
0.648949
49.464883
0.46148
46.148036
0.374161
16.55481
0.470427
18.736719
0.538065
48.673907
true
false
2025-01-31
2025-01-31
1
YOYO-AI/Qwen2.5-14B-YOYO-SCE (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-V4_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-V4
1befb006bc7c4732ec627b65cb069ad1a66648ed
42.285519
apache-2.0
4
14.766
true
false
false
true
3.693344
0.839783
83.978289
0.649035
49.672403
0.534743
53.47432
0.322148
9.619687
0.411521
10.640104
0.516955
46.32831
true
false
2025-03-02
2025-03-03
1
YOYO-AI/Qwen2.5-14B-YOYO-V4 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-V4-p1_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-V4-p1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-V4-p1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-V4-p1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-V4-p1
c5e0a02061f3857fcb2c93904548351766b3feaf
42.458285
apache-2.0
0
14.766
true
false
false
true
1.765685
0.820349
82.03489
0.651554
50.245421
0.533233
53.323263
0.345638
12.751678
0.419427
11.728385
0.501995
44.666076
true
false
2025-02-27
2025-02-28
1
YOYO-AI/Qwen2.5-14B-YOYO-V4-p1 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-V4-p2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-V4-p2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-V4-p2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-V4-p2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-V4-p2
6a3445d37b1d99fbc8b26f4dd73af4182545810c
41.584665
apache-2.0
2
14.766
true
false
false
true
1.757041
0.804787
80.478685
0.633892
47.026926
0.516616
51.661631
0.327181
10.290828
0.443458
15.965625
0.496759
44.084294
true
false
2025-03-01
2025-03-01
1
YOYO-AI/Qwen2.5-14B-YOYO-V4-p2 (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-latest_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-latest" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-latest</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-latest-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-latest
96d256afcba2e6616cecd1ca78bbb65f423ef1a4
40.078313
apache-2.0
0
14.77
true
false
false
false
3.953942
0.591064
59.106393
0.665623
52.035432
0.441843
44.18429
0.38255
17.673378
0.469125
18.907292
0.537068
48.563091
true
false
2025-02-02
2025-02-02
1
YOYO-AI/Qwen2.5-14B-YOYO-latest (Merge)
YOYO-AI_Qwen2.5-14B-YOYO-latest-V2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-YOYO-latest-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-YOYO-latest-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-YOYO-latest-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-YOYO-latest-V2
2945b25444c537ed3b94bb088cdd035f78e5b676
41.845467
apache-2.0
0
14.766
true
false
false
true
3.697726
0.777135
77.713467
0.629902
47.298906
0.515861
51.586103
0.354027
13.870246
0.429938
13.675521
0.522357
46.928561
true
false
2025-02-17
2025-02-18
1
YOYO-AI/Qwen2.5-14B-YOYO-latest-V2 (Merge)
YOYO-AI_Qwen2.5-14B-it-restore_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-14B-it-restore" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-14B-it-restore</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-14B-it-restore-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-14B-it-restore
2da0ed7953502c4aa27a4fcb39af09b9e6614db9
41.502591
apache-2.0
0
14.766
true
false
false
true
1.670695
0.820948
82.094842
0.638773
48.425922
0.537009
53.700906
0.337248
11.63311
0.408729
9.824479
0.490027
43.336288
true
false
2025-03-09
2025-03-09
1
YOYO-AI/Qwen2.5-14B-it-restore (Merge)
YOYO-AI_Qwen2.5-7B-it-restore_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-7B-it-restore" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-7B-it-restore</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-7B-it-restore-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-7B-it-restore
c45b9a94190dfdbc24c2e3fb183072738a8a5d97
35.333303
apache-2.0
0
7.613
true
false
false
true
0.668873
0.75308
75.307961
0.540652
35.084322
0.5
50
0.301174
6.823266
0.400698
8.253906
0.428773
36.530363
true
false
2025-03-10
2025-03-10
1
YOYO-AI/Qwen2.5-7B-it-restore (Merge)
YOYO-AI_Qwen2.5-Coder-14B-YOYO-1010_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/Qwen2.5-Coder-14B-YOYO-1010" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/Qwen2.5-Coder-14B-YOYO-1010</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__Qwen2.5-Coder-14B-YOYO-1010-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/Qwen2.5-Coder-14B-YOYO-1010
45c2fd877aefbdd560eadf12cd0a766029b1f782
32.05464
apache-2.0
0
14.77
true
false
false
false
3.230075
0.533586
53.358644
0.618666
45.20119
0.321752
32.175227
0.352349
13.646532
0.44224
13.779948
0.407497
34.166297
true
false
2025-01-27
2025-01-28
1
YOYO-AI/Qwen2.5-Coder-14B-YOYO-1010 (Merge)
YOYO-AI_ZYH-LLM-Qwen2.5-14B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/ZYH-LLM-Qwen2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/ZYH-LLM-Qwen2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__ZYH-LLM-Qwen2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/ZYH-LLM-Qwen2.5-14B
93b4ec952f28fd349e37674a41fa111f5adc6a18
39.823589
apache-2.0
2
14.77
true
false
false
false
3.941876
0.594111
59.41114
0.664446
52.042745
0.411631
41.163142
0.385906
18.120805
0.475698
19.86224
0.535073
48.34146
true
false
2025-02-05
2025-02-05
1
YOYO-AI/ZYH-LLM-Qwen2.5-14B (Merge)
YOYO-AI_ZYH-LLM-Qwen2.5-14B-V2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/ZYH-LLM-Qwen2.5-14B-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/ZYH-LLM-Qwen2.5-14B-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__ZYH-LLM-Qwen2.5-14B-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/ZYH-LLM-Qwen2.5-14B-V2
fc0ca4d32fd37afc46adb7a77021c1027f2aacda
36.566362
apache-2.0
1
14.766
true
false
false
false
2.015112
0.507083
50.708343
0.645208
49.088647
0.35423
35.422961
0.379195
17.225951
0.468906
18.379948
0.537151
48.572326
true
false
2025-02-08
2025-02-08
1
YOYO-AI/ZYH-LLM-Qwen2.5-14B-V2 (Merge)
YOYO-AI_ZYH-LLM-Qwen2.5-14B-V3_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/ZYH-LLM-Qwen2.5-14B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/ZYH-LLM-Qwen2.5-14B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__ZYH-LLM-Qwen2.5-14B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/ZYH-LLM-Qwen2.5-14B-V3
3458f675f43efdd59b705f5dd9baea091bbea27c
41.628252
apache-2.0
6
14.766
true
false
false
true
1.863211
0.857793
85.779288
0.635925
48.182465
0.52719
52.719033
0.332215
10.961969
0.402156
9.002865
0.488115
43.123892
true
false
2025-02-23
2025-02-24
1
YOYO-AI/ZYH-LLM-Qwen2.5-14B-V3 (Merge)
YOYO-AI_ZYH-LLM-Qwen2.5-14B-V4_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/YOYO-AI/ZYH-LLM-Qwen2.5-14B-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YOYO-AI/ZYH-LLM-Qwen2.5-14B-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YOYO-AI__ZYH-LLM-Qwen2.5-14B-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YOYO-AI/ZYH-LLM-Qwen2.5-14B-V4
eb2fd7148b95fb8dc30a8d8bdda5f55bf0d4ec94
43.137421
apache-2.0
4
14.766
true
false
false
true
1.801893
0.836461
83.646059
0.651497
50.269353
0.539275
53.927492
0.314597
8.612975
0.443427
15.661719
0.520362
46.70693
true
false
2025-03-12
2025-03-13
1
YOYO-AI/ZYH-LLM-Qwen2.5-14B-V4 (Merge)
Yash21_TinyYi-7B-Test_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Yash21/TinyYi-7B-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Yash21/TinyYi-7B-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Yash21__TinyYi-7B-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Yash21/TinyYi-7B-Test
7750e5de73fbcf1dcc0832b4cdabaa9713c20475
4.495167
apache-2.0
0
6.061
true
false
false
false
1.526218
0.185649
18.564852
0.29098
2.267966
0
0
0.264262
1.901566
0.336448
3.222656
0.109126
1.013963
true
false
2024-01-06
2024-07-03
0
Yash21/TinyYi-7B-Test
Youlln_1PARAMMYL-8B-ModelStock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/1PARAMMYL-8B-ModelStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/1PARAMMYL-8B-ModelStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__1PARAMMYL-8B-ModelStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/1PARAMMYL-8B-ModelStock
4ce556da5ccd1ecac8d0f3e1e94d1982f11b910d
26.309152
0
8.03
false
false
false
false
1.785044
0.537134
53.713369
0.521584
31.799951
0.148792
14.879154
0.323826
9.8434
0.440938
14.283854
0.400017
33.33518
false
false
2024-09-20
2024-09-20
1
Youlln/1PARAMMYL-8B-ModelStock (Merge)
Youlln_2PRYMMAL-Yi1.5-6B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/2PRYMMAL-Yi1.5-6B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/2PRYMMAL-Yi1.5-6B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__2PRYMMAL-Yi1.5-6B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/2PRYMMAL-Yi1.5-6B-SLERP
b776bd3ce6784b96ff928b1d5ad51b2991909f2c
18.991811
apache-2.0
0
6.061
true
false
false
false
2.078897
0.282594
28.259352
0.466475
24.495644
0.113293
11.329305
0.307047
7.606264
0.475604
18.150521
0.316988
24.109781
true
false
2024-09-22
2024-09-23
1
Youlln/2PRYMMAL-Yi1.5-6B-SLERP (Merge)
Youlln_3PRYMMAL-PHI3-3B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/3PRYMMAL-PHI3-3B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/3PRYMMAL-PHI3-3B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__3PRYMMAL-PHI3-3B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/3PRYMMAL-PHI3-3B-SLERP
9396bcf1709ac8360a95a746482520fab4295706
25.138741
apache-2.0
0
3
true
false
false
false
2.156878
0.36555
36.555007
0.542183
35.827668
0.17145
17.145015
0.326342
10.178971
0.464844
17.772135
0.400183
33.35365
true
false
2024-09-23
2024-09-23
1
Youlln/3PRYMMAL-PHI3-3B-SLERP (Merge)
Youlln_4PRYMMAL-GEMMA2-9B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/4PRYMMAL-GEMMA2-9B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/4PRYMMAL-GEMMA2-9B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__4PRYMMAL-GEMMA2-9B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/4PRYMMAL-GEMMA2-9B-SLERP
7dac3b4ab4298113ae3103d63bb284e1ac8bf4d4
23.688708
apache-2.0
1
9.242
true
false
false
false
5.70532
0.271377
27.137661
0.592253
42.064172
0.090634
9.063444
0.330537
10.738255
0.467198
17.466406
0.420961
35.662308
true
false
2024-09-23
2024-09-23
1
Youlln/4PRYMMAL-GEMMA2-9B-SLERP (Merge)
Youlln_ECE-MIRAGE-1-12B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-MIRAGE-1-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-MIRAGE-1-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-MIRAGE-1-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-MIRAGE-1-12B
c9ecb705a9d39be9250d5372a7711e491f6e2154
4.78572
apache-2.0
1
15.21
true
false
false
false
1.267545
0.206981
20.698081
0.301071
2.600553
0
0
0.263423
1.789709
0.321938
2.408854
0.110954
1.217125
false
false
2025-02-10
2025-02-11
0
Youlln/ECE-MIRAGE-1-12B
Youlln_ECE-MIRAGE-1-15B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-MIRAGE-1-15B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-MIRAGE-1-15B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-MIRAGE-1-15B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-MIRAGE-1-15B
c9ecb705a9d39be9250d5372a7711e491f6e2154
4.78572
apache-2.0
1
15.21
true
false
false
false
1.288163
0.206981
20.698081
0.301071
2.600553
0
0
0.263423
1.789709
0.321938
2.408854
0.110954
1.217125
false
false
2025-02-10
2025-02-21
0
Youlln/ECE-MIRAGE-1-15B
Youlln_ECE-PRYMMAL-0.5B-FT-V3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-FT-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-FT-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-FT-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-0.5B-FT-V3
d542b4d53888fcc8e96c32892d47ec51afc9edc9
4.392856
apache-2.0
0
0.494
true
false
false
false
1.14627
0.164191
16.419101
0.309313
3.616883
0.003021
0.302115
0.25755
1.006711
0.364448
3.222656
0.116107
1.789672
false
false
2024-10-16
2024-10-16
1
Youlln/ECE-PRYMMAL-0.5B-FT-V3 (Merge)
Youlln_ECE-PRYMMAL-0.5B-FT-V3-MUSR_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-FT-V3-MUSR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR
221dc80a1acd6f7dda0644699e6d61b90a5a0a05
5.538703
apache-2.0
0
0.494
true
false
false
false
2.059969
0.15335
15.334978
0.304115
5.062186
0.024169
2.416918
0.249161
0
0.366031
3.253906
0.164478
7.164229
false
false
2024-10-21
2024-10-21
1
Youlln/ECE-PRYMMAL-0.5B-FT-V3-MUSR (Merge)
Youlln_ECE-PRYMMAL-0.5B-FT-V4-MUSR_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-FT-V4-MUSR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR
f5b268d63bb10f05a229da4f2ee9cb0882c93971
4.211187
apache-2.0
0
0.494
true
false
false
false
1.897383
0.113757
11.375705
0.303836
4.949092
0.012085
1.208459
0.270134
2.684564
0.352885
1.477344
0.132148
3.571956
false
false
2024-10-21
2024-10-21
1
Youlln/ECE-PRYMMAL-0.5B-FT-V4-MUSR (Merge)
Youlln_ECE-PRYMMAL-0.5B-SLERP-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-0.5B-SLERP-V2
5e87669abcdc042774a63b94a13880f1acd6e15d
4.627195
apache-2.0
0
0.494
true
false
false
false
1.309995
0.161193
16.119341
0.293477
1.917561
0.000755
0.075529
0.274329
3.243848
0.383115
5.35599
0.109458
1.050901
false
false
2024-10-22
2024-10-22
1
Youlln/ECE-PRYMMAL-0.5B-SLERP-V2 (Merge)
Youlln_ECE-PRYMMAL-0.5B-SLERP-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-0.5B-SLERP-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-0.5B-SLERP-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-0.5B-SLERP-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-0.5B-SLERP-V3
94bfab3b1f41458427e5f8598ceb3ec731ba1bd6
3.663014
apache-2.0
0
0.494
true
false
false
false
1.275397
0.167014
16.701352
0.293838
2.319605
0
0
0.251678
0.223714
0.354125
1.765625
0.10871
0.96779
false
false
2024-10-22
2024-10-22
0
Youlln/ECE-PRYMMAL-0.5B-SLERP-V3
Youlln_ECE-PRYMMAL-YL-1B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-YL-1B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1
b5cd268edb0cc5c2c6ab2c49c950e611b2b8138c
16.681936
apache-2.0
0
1.544
true
false
false
false
1.190682
0.325108
32.510849
0.420851
18.279511
0.107251
10.725076
0.291107
5.480984
0.426583
11.589583
0.293551
21.505615
false
false
2024-11-08
2024-11-08
0
Youlln/ECE-PRYMMAL-YL-1B-SLERP-V1
Youlln_ECE-PRYMMAL-YL-1B-SLERP-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-YL-1B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2
3559f643c8d5774135a1cd8daea78fef31035679
16.681936
apache-2.0
0
1.544
true
false
false
false
1.209256
0.325108
32.510849
0.420851
18.279511
0.107251
10.725076
0.291107
5.480984
0.426583
11.589583
0.293551
21.505615
false
false
2024-11-08
2024-11-08
0
Youlln/ECE-PRYMMAL-YL-1B-SLERP-V2
Youlln_ECE-PRYMMAL-YL-7B-SLERP-V4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL-YL-7B-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4
4939b9e24be6f03d5df1e9bb7dc1b4fd5d59404a
10.869547
apache-2.0
0
7.616
true
false
false
false
1.540314
0.25097
25.096965
0.376973
13.157437
0.053625
5.362538
0.265101
2.013423
0.37449
7.011198
0.213182
12.575724
false
false
2024-11-06
2024-11-06
0
Youlln/ECE-PRYMMAL-YL-7B-SLERP-V4
Youlln_ECE-PRYMMAL0.5-FT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL0.5-FT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL0.5-FT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL0.5-FT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL0.5-FT
56b9fd5f26e5b6379fe4aa62e0f66b87b5c6f8e8
5.585742
apache-2.0
0
0.494
true
false
false
false
1.006783
0.185073
18.507338
0.313209
5.1516
0.023414
2.34139
0.255872
0.782998
0.330125
1.432292
0.147689
5.298833
false
false
2024-10-02
2024-10-02
1
Youlln/ECE-PRYMMAL0.5-FT (Merge)
Youlln_ECE-PRYMMAL0.5B-Youri_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL0.5B-Youri" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL0.5B-Youri</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL0.5B-Youri-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL0.5B-Youri
1477d3deff98f35f523aa222bc0442278d464566
3.505274
1
0.63
false
false
false
false
1.311692
0.144632
14.46318
0.281736
1.501296
0
0
0.243289
0
0.369656
4.007031
0.109541
1.060136
false
false
2024-10-07
2024-10-07
1
Youlln/ECE-PRYMMAL0.5B-Youri (Merge)
Youlln_ECE-PRYMMAL1B-FT-V1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-PRYMMAL1B-FT-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-PRYMMAL1B-FT-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-PRYMMAL1B-FT-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-PRYMMAL1B-FT-V1
d0fc3a6e93f91c8d586eb25c9f2a4ea4ca99e9f4
11.847798
apache-2.0
0
1.544
true
false
false
false
1.475191
0.214375
21.437453
0.403265
16.189386
0.064199
6.41994
0.278523
3.803132
0.341656
3.873698
0.274269
19.36318
false
false
2024-10-12
2024-10-12
1
Youlln/ECE-PRYMMAL1B-FT-V1 (Merge)
Youlln_ECE-Qwen0.5B-FT-V2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE-Qwen0.5B-FT-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE-Qwen0.5B-FT-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE-Qwen0.5B-FT-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE-Qwen0.5B-FT-V2
c87da3f19ab74854fca30f9ca71ce5c4884ef629
7.574687
apache-2.0
0
0.494
true
false
false
false
1.046599
0.252593
25.259312
0.328971
7.632148
0.020393
2.039275
0.266779
2.237136
0.306281
0.885156
0.166556
7.395095
false
false
2024-10-11
2024-10-11
1
Youlln/ECE-Qwen0.5B-FT-V2 (Merge)
Youlln_ECE.EIFFEIL.ia-0.5B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Youlln/ECE.EIFFEIL.ia-0.5B-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Youlln/ECE.EIFFEIL.ia-0.5B-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Youlln__ECE.EIFFEIL.ia-0.5B-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Youlln/ECE.EIFFEIL.ia-0.5B-SLERP
e376ce416af881eefa778d2566d15d9a6d29e7d9
8.829966
apache-2.0
0
0.63
true
false
false
false
1.208119
0.25614
25.614037
0.330567
8.405356
0.059668
5.966767
0.265101
2.013423
0.310219
0.94401
0.190326
10.0362
true
false
2024-10-14
2024-10-14
1
Youlln/ECE.EIFFEIL.ia-0.5B-SLERP (Merge)
YoungPanda_qwenqwen_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/YoungPanda/qwenqwen" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">YoungPanda/qwenqwen</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/YoungPanda__qwenqwen-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
YoungPanda/qwenqwen
3b5d9b63076acc8988b8f7e9734cf1d78bb39c25
4.783628
0
14.316
false
false
false
true
14.245338
0.126397
12.639685
0.337899
8.19478
0.035498
3.549849
0.25
0
0.343365
2.453906
0.116772
1.863549
false
false
2024-09-12
0
Removed
Yuma42_KangalKhan-RawRuby-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Yuma42/KangalKhan-RawRuby-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Yuma42/KangalKhan-RawRuby-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Yuma42__KangalKhan-RawRuby-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Yuma42/KangalKhan-RawRuby-7B
54f56d4c6889eaf43fdd5f7d6dcef3c2ebe51929
20.49109
apache-2.0
7
7.242
true
false
false
true
1.307439
0.547675
54.767461
0.475473
26.387284
0.066465
6.646526
0.287752
5.033557
0.394958
7.636458
0.302277
22.475251
true
false
2024-02-17
2024-06-26
1
Yuma42/KangalKhan-RawRuby-7B (Merge)
Yuma42_Llama3.1-IgneousIguana-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Yuma42/Llama3.1-IgneousIguana-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Yuma42/Llama3.1-IgneousIguana-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Yuma42__Llama3.1-IgneousIguana-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Yuma42/Llama3.1-IgneousIguana-8B
b3d20a0fdd9002cc39b921363ec873475f68874f
31.476167
llama3.1
2
8.03
true
false
false
true
0.707055
0.81333
81.332974
0.519051
31.985927
0.219789
21.978852
0.310403
8.053691
0.42026
12.465885
0.397357
33.039672
true
false
2025-03-10
2025-03-10
1
Yuma42/Llama3.1-IgneousIguana-8B (Merge)
Yuma42_Llama3.1-SuperHawk-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Yuma42/Llama3.1-SuperHawk-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Yuma42/Llama3.1-SuperHawk-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Yuma42__Llama3.1-SuperHawk-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Yuma42/Llama3.1-SuperHawk-8B
4c9dfbc9b8b7bd98ca5ec288f4aab0cf85b2ccb5
31.135471
llama3.1
1
8.03
true
false
false
true
0.712008
0.798642
79.864205
0.519993
31.966635
0.234894
23.489426
0.312919
8.389262
0.408354
10.377604
0.394531
32.725694
true
false
2025-03-10
2025-03-10
1
Yuma42/Llama3.1-SuperHawk-8B (Merge)
Z1-Coder_Z1-Coder-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Z1-Coder/Z1-Coder-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Z1-Coder/Z1-Coder-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Z1-Coder__Z1-Coder-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Z1-Coder/Z1-Coder-7B
fb3866e2735424e6e133a1ca955dcdec6b577908
21.533349
mit
0
7.613
true
false
false
true
1.170037
0.321511
32.151137
0.484183
28.158145
0.324773
32.477341
0.272651
3.020134
0.362156
2.736198
0.375914
30.657137
false
false
2025-01-18
2025-03-01
0
Z1-Coder/Z1-Coder-7B
ZHLiu627_zephyr-7b-gemma-dpo-avg_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/ZHLiu627/zephyr-7b-gemma-dpo-avg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZHLiu627/zephyr-7b-gemma-dpo-avg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZHLiu627__zephyr-7b-gemma-dpo-avg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZHLiu627/zephyr-7b-gemma-dpo-avg
b97a56a94799ff084fde17bb8e65f3d1d80ecfe3
14.667993
0
8.538
false
false
false
false
0.887614
0.308997
30.89968
0.414882
18.404535
0.045317
4.531722
0.278523
3.803132
0.410708
9.805208
0.285073
20.563682
false
false
2025-02-25
0
Removed
ZHLiu627_zephyr-7b-gemma-rpo-avg_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/ZHLiu627/zephyr-7b-gemma-rpo-avg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZHLiu627/zephyr-7b-gemma-rpo-avg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZHLiu627__zephyr-7b-gemma-rpo-avg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZHLiu627/zephyr-7b-gemma-rpo-avg
d66c14a077fd7aa29dabaf3fb19e0f3630fb3646
14.588312
apache-2.0
0
8.538
true
false
false
false
0.87133
0.300604
30.060351
0.418328
19.016801
0.049849
4.984894
0.276846
3.579418
0.408104
9.546354
0.283078
20.342051
false
false
2025-02-23
2025-02-25
0
ZHLiu627/zephyr-7b-gemma-rpo-avg
ZeroXClem_L3-Aspire-Heart-Matrix-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/L3-Aspire-Heart-Matrix-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/L3-Aspire-Heart-Matrix-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__L3-Aspire-Heart-Matrix-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/L3-Aspire-Heart-Matrix-8B
d63917595e911b077cff38109c74622c3ec41704
25.815224
apache-2.0
4
8.03
true
false
false
true
1.575403
0.483353
48.335306
0.538421
34.307547
0.182779
18.277946
0.324664
9.955257
0.418708
13.071875
0.378491
30.94341
true
false
2024-11-21
2024-11-22
1
ZeroXClem/L3-Aspire-Heart-Matrix-8B (Merge)
ZeroXClem_Llama-3.1-8B-AthenaSky-MegaMix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Llama-3.1-8B-AthenaSky-MegaMix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Llama-3.1-8B-AthenaSky-MegaMix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Llama-3.1-8B-AthenaSky-MegaMix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Llama-3.1-8B-AthenaSky-MegaMix
b67376478ad1f6e1355e9c42aea4072561fccc82
26.791595
apache-2.0
2
8.03
true
false
false
true
0.738433
0.630082
63.008152
0.516342
31.385284
0.279456
27.945619
0.277685
3.691275
0.353844
6.897135
0.350399
27.822104
true
false
2025-03-11
2025-03-11
1
ZeroXClem/Llama-3.1-8B-AthenaSky-MegaMix (Merge)
ZeroXClem_Llama-3.1-8B-RainbowLight-EtherealMix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Llama-3.1-8B-RainbowLight-EtherealMix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Llama-3.1-8B-RainbowLight-EtherealMix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Llama-3.1-8B-RainbowLight-EtherealMix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Llama-3.1-8B-RainbowLight-EtherealMix
f3dacbc5d69989216ebe5c30092b25cb40f93cd9
22.83098
apache-2.0
2
8.03
true
false
false
true
0.763002
0.497341
49.73415
0.515479
31.072264
0.121601
12.160121
0.286913
4.9217
0.394708
9.871875
0.363032
29.225768
true
false
2025-03-10
2025-03-11
1
ZeroXClem/Llama-3.1-8B-RainbowLight-EtherealMix (Merge)
ZeroXClem_Llama-3.1-8B-SpecialTitanFusion_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Llama-3.1-8B-SpecialTitanFusion" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Llama-3.1-8B-SpecialTitanFusion</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Llama-3.1-8B-SpecialTitanFusion-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Llama-3.1-8B-SpecialTitanFusion
c5c0f936a7e182989dd035a846b08bfd578da6e5
29.233404
apache-2.0
2
8.03
true
false
false
true
0.68587
0.74024
74.024034
0.543893
34.823136
0.233384
23.338369
0.299497
6.599553
0.387396
7.491146
0.362118
29.124187
true
false
2025-03-11
2025-03-11
1
ZeroXClem/Llama-3.1-8B-SpecialTitanFusion (Merge)
ZeroXClem_Llama-3.1-8B-SuperNova-EtherealHermes_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Llama-3.1-8B-SuperNova-EtherealHermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Llama-3.1-8B-SuperNova-EtherealHermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Llama-3.1-8B-SuperNova-EtherealHermes-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Llama-3.1-8B-SuperNova-EtherealHermes
3512e8f5998996db8375142dc80df77a93cd7dc6
28.40554
apache-2.0
3
8.03
true
false
false
true
0.737191
0.733871
73.387057
0.524446
32.071289
0.174471
17.44713
0.292785
5.704698
0.406583
11.322917
0.374501
30.500148
true
false
2025-03-10
2025-03-11
1
ZeroXClem/Llama-3.1-8B-SuperNova-EtherealHermes (Merge)
ZeroXClem_Llama-3.1-8B-SuperTulu-LexiNova_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Llama-3.1-8B-SuperTulu-LexiNova" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Llama-3.1-8B-SuperTulu-LexiNova</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Llama-3.1-8B-SuperTulu-LexiNova-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Llama-3.1-8B-SuperTulu-LexiNova
c25f5786954e523b1b0a2376db9abe1fd22313d1
23.300171
apache-2.0
2
8.03
true
false
false
false
1.46825
0.416458
41.645833
0.50786
30.502797
0.253021
25.302115
0.286074
4.809843
0.397062
11.232812
0.336769
26.307624
true
false
2025-03-10
2025-03-10
1
ZeroXClem/Llama-3.1-8B-SuperTulu-LexiNova (Merge)
ZeroXClem_Qwen-2.5-Aether-SlerpFusion-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen-2.5-Aether-SlerpFusion-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B
23992e1be9f77d767181dc7bcb42176395f42c30
30.11833
apache-2.0
2
7.616
true
false
false
true
1.352707
0.62616
62.61597
0.546224
36.011209
0.273414
27.34139
0.298658
6.487696
0.417781
11.289323
0.43268
36.964391
true
false
2024-11-13
2024-11-20
1
ZeroXClem/Qwen-2.5-Aether-SlerpFusion-7B (Merge)
ZeroXClem_Qwen2.5-7B-CelestialHarmony-1M_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen2.5-7B-CelestialHarmony-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen2.5-7B-CelestialHarmony-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen2.5-7B-CelestialHarmony-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Qwen2.5-7B-CelestialHarmony-1M
36dbdb9dec7d9d9dc49c132167121cce45d71394
32.038916
mit
7
7.613
true
false
false
false
1.431679
0.594386
59.438623
0.543137
34.507416
0.347432
34.743202
0.318792
9.17226
0.459542
16.742708
0.438664
37.629285
true
false
2025-01-30
2025-02-05
1
ZeroXClem/Qwen2.5-7B-CelestialHarmony-1M (Merge)
ZeroXClem_Qwen2.5-7B-HomerAnvita-NerdMix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen2.5-7B-HomerAnvita-NerdMix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix
cd87a9d7c9a9c8950af84e1f4c72fff5d4625d8a
35.641865
apache-2.0
4
7.616
true
false
false
true
1.568386
0.770765
77.07649
0.554132
36.579206
0.383686
38.36858
0.319631
9.284116
0.439052
14.414844
0.443152
38.127955
true
false
2024-11-21
2024-11-21
1
ZeroXClem/Qwen2.5-7B-HomerAnvita-NerdMix (Merge)
ZeroXClem_Qwen2.5-7B-HomerCreative-Mix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen2.5-7B-HomerCreative-Mix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen2.5-7B-HomerCreative-Mix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen2.5-7B-HomerCreative-Mix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Qwen2.5-7B-HomerCreative-Mix
6849553db73428ca67823a06f5cfeea660f77df8
34.907245
apache-2.0
10
7.616
true
false
false
true
1.459141
0.783504
78.350443
0.554807
36.770722
0.356495
35.649547
0.299497
6.599553
0.434958
13.769792
0.444731
38.303413
true
false
2024-11-21
2024-11-21
1
ZeroXClem/Qwen2.5-7B-HomerCreative-Mix (Merge)
ZeroXClem_Qwen2.5-7B-Qandora-CySec_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ZeroXClem/Qwen2.5-7B-Qandora-CySec" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeroXClem/Qwen2.5-7B-Qandora-CySec</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeroXClem__Qwen2.5-7B-Qandora-CySec-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeroXClem/Qwen2.5-7B-Qandora-CySec
6c8b513dbc61a9f704210d26124244f19f3bc4cc
32.0235
apache-2.0
4
7.616
true
false
false
true
1.364161
0.677317
67.73173
0.549002
36.264898
0.293051
29.305136
0.300336
6.711409
0.428604
13.408854
0.448471
38.718972
true
false
2024-11-12
2024-11-12
1
ZeroXClem/Qwen2.5-7B-Qandora-CySec (Merge)
ZeusLabs_L3-Aethora-15B-V2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZeusLabs/L3-Aethora-15B-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZeusLabs/L3-Aethora-15B-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZeusLabs__L3-Aethora-15B-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZeusLabs/L3-Aethora-15B-V2
2c601f116c37dd912c89357dbdbef879a637997e
24.698714
cc-by-sa-4.0
41
15.01
true
false
false
true
4.755337
0.720806
72.080635
0.501091
28.968505
0.080816
8.081571
0.287752
5.033557
0.387083
6.252083
0.349983
27.775931
false
false
2024-06-27
2024-06-27
1
ZeusLabs/L3-Aethora-15B-V2 (Merge)
ZhangShenao_SELM-Llama-3-8B-Instruct-iter-3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ZhangShenao/SELM-Llama-3-8B-Instruct-iter-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ZhangShenao/SELM-Llama-3-8B-Instruct-iter-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ZhangShenao__SELM-Llama-3-8B-Instruct-iter-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ZhangShenao/SELM-Llama-3-8B-Instruct-iter-3
9c95ccdeceed14a3c2881bc495101a1acca1385f
24.042938
mit
5
8.03
true
false
false
true
1.311182
0.690282
69.028179
0.504609
29.078531
0.086103
8.610272
0.258389
1.118568
0.38451
5.497135
0.378324
30.924941
false
false
2024-05-25
2024-07-02
3
meta-llama/Meta-Llama-3-8B-Instruct
aaditya_Llama3-OpenBioLLM-70B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/aaditya/Llama3-OpenBioLLM-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aaditya/Llama3-OpenBioLLM-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aaditya__Llama3-OpenBioLLM-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
aaditya/Llama3-OpenBioLLM-70B
5f79deaf38bc5f662943d304d59cb30357e8e5bd
34.97902
llama3
406
70
true
false
false
true
19.314045
0.759674
75.967433
0.639887
47.147075
0.19713
19.712991
0.322987
9.731544
0.441719
14.348177
0.486702
42.966903
false
false
2024-04-24
2024-08-30
2
meta-llama/Meta-Llama-3-70B
abacusai_Dracarys-72B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Dracarys-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Dracarys-72B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Dracarys-72B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Dracarys-72B-Instruct
10cabc4beb57a69df51533f65e39a7ad22821370
43.377212
other
21
72.706
true
false
false
true
24.766928
0.785578
78.557782
0.694407
56.93552
0.396526
39.652568
0.39094
18.791946
0.455823
16.811198
0.545628
49.514258
false
true
2024-08-14
2024-08-16
0
abacusai/Dracarys-72B-Instruct
abacusai_Liberated-Qwen1.5-14B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Liberated-Qwen1.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Liberated-Qwen1.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Liberated-Qwen1.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Liberated-Qwen1.5-14B
cc0fa5102bfee821bb5e49f082731ccb9d1fedf1
20.508142
other
20
14
true
false
false
true
6.107199
0.363102
36.310212
0.4948
28.020906
0.160121
16.012085
0.283557
4.474273
0.417469
10.316927
0.35123
27.91445
false
true
2024-03-05
2024-09-05
0
abacusai/Liberated-Qwen1.5-14B
abacusai_Llama-3-Smaug-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Llama-3-Smaug-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Llama-3-Smaug-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Llama-3-Smaug-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Llama-3-Smaug-8B
fe54a7d42160d3d8fcc3289c8c411fd9dd5e8357
19.067762
llama2
89
8.03
true
false
false
true
1.820437
0.486675
48.667535
0.493071
27.880374
0.085347
8.534743
0.248322
0
0.36225
5.047917
0.318484
24.276005
false
true
2024-04-19
2024-07-02
0
abacusai/Llama-3-Smaug-8B
abacusai_Smaug-34B-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-34B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-34B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-34B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Smaug-34B-v0.1
34d54c65a0247d5eb694973106c816d9c0ad3fc2
24.953218
apache-2.0
60
34.389
true
false
false
true
23.571881
0.501563
50.156252
0.535779
34.261661
0.071752
7.175227
0.329698
10.626398
0.397875
8.134375
0.454289
39.365396
false
true
2024-01-25
2024-06-12
1
jondurbin/bagel-34b-v0.2
abacusai_Smaug-72B-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-72B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-72B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-72B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Smaug-72B-v0.1
a1d657156f82c24b670158406378648233487011
29.737299
other
468
72.289
true
false
false
false
58.471603
0.5167
51.670013
0.599563
43.1251
0.191088
19.108761
0.323826
9.8434
0.447323
14.415365
0.46235
40.261155
false
true
2024-02-02
2024-06-12
1
moreh/MoMo-72B-lora-1.8.7-DPO
abacusai_Smaug-Llama-3-70B-Instruct-32K_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-Llama-3-70B-Instruct-32K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-Llama-3-70B-Instruct-32K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-Llama-3-70B-Instruct-32K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Smaug-Llama-3-70B-Instruct-32K
33840982dc253968f32ef3a534ee0e025eb97482
35.764892
llama3
21
70.554
true
false
false
true
26.606826
0.776111
77.611072
0.649311
49.07037
0.274924
27.492447
0.296141
6.152125
0.420792
12.432292
0.476479
41.831043
false
true
2024-06-11
2024-08-06
0
abacusai/Smaug-Llama-3-70B-Instruct-32K
abacusai_Smaug-Mixtral-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-Mixtral-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-Mixtral-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-Mixtral-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Smaug-Mixtral-v0.1
98fdc8315906b0a8b9e7f24bad89914869fcfc20
23.821471
apache-2.0
12
46.703
true
true
false
true
7.882831
0.555443
55.544289
0.516225
31.919261
0.095166
9.516616
0.301174
6.823266
0.429813
12.993229
0.335189
26.132166
false
true
2024-02-18
2024-08-30
0
abacusai/Smaug-Mixtral-v0.1
abacusai_Smaug-Qwen2-72B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/Smaug-Qwen2-72B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/Smaug-Qwen2-72B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__Smaug-Qwen2-72B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/Smaug-Qwen2-72B-Instruct
af015925946d0c60ef69f512c3b35f421cf8063d
42.07422
other
9
72.706
true
false
false
true
26.51467
0.78253
78.253035
0.690979
56.266172
0.413142
41.314199
0.361577
14.876957
0.440073
15.175781
0.519033
46.559176
false
true
2024-06-26
2024-07-29
0
abacusai/Smaug-Qwen2-72B-Instruct
abacusai_bigstral-12b-32k_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/bigstral-12b-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/bigstral-12b-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__bigstral-12b-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/bigstral-12b-32k
b78a5385ec1b04d6c97f25e9ba1dff18dc98305f
18.135142
apache-2.0
43
12.476
true
false
false
false
1.930559
0.419381
41.938058
0.470012
25.556902
0.015106
1.510574
0.292785
5.704698
0.455979
15.864063
0.264129
18.236554
true
true
2024-03-06
2024-09-04
1
abacusai/bigstral-12b-32k (Merge)
abacusai_bigyi-15b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abacusai/bigyi-15b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abacusai/bigyi-15b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abacusai__bigyi-15b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abacusai/bigyi-15b
b878c15531f7aaf6cf287530f1117b1308b96dc4
13.051824
other
11
15.058
true
false
false
false
3.742244
0.209403
20.940327
0.43453
19.940223
0.029456
2.945619
0.309564
7.941834
0.353781
4.289323
0.300283
22.25362
true
true
2024-03-06
2024-09-17
1
abacusai/bigyi-15b (Merge)
abhishek_autotrain-0tmgq-5tpbg_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-0tmgq-5tpbg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-0tmgq-5tpbg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-0tmgq-5tpbg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-0tmgq-5tpbg
a75e1fda984e009613dca3b7846c579a37ab0673
4.856619
other
0
0.135
true
false
false
true
0.351828
0.195715
19.571515
0.313451
4.268752
0
0
0.251678
0.223714
0.365042
3.396875
0.11511
1.678856
false
false
2024-11-19
2024-12-03
2
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
abhishek_autotrain-0tmgq-5tpbg_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-0tmgq-5tpbg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-0tmgq-5tpbg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-0tmgq-5tpbg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-0tmgq-5tpbg
a75e1fda984e009613dca3b7846c579a37ab0673
5.051545
other
0
0.135
true
false
false
true
0.673608
0.195165
19.516549
0.312733
4.419023
0.01284
1.283988
0.259228
1.230425
0.358375
2.263542
0.114362
1.595745
false
false
2024-11-19
2024-12-04
2
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
abhishek_autotrain-llama3-70b-orpo-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-llama3-70b-orpo-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-llama3-70b-orpo-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-llama3-70b-orpo-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-llama3-70b-orpo-v1
053236c6846cc561c1503ba05e2b28c94855a432
14.813377
other
4
70.554
true
false
false
true
21.522056
0.423302
42.330239
0.599799
41.565362
0.010574
1.057402
0.244128
0
0.357906
2.571615
0.112201
1.355644
false
false
2024-05-02
2024-08-30
0
abhishek/autotrain-llama3-70b-orpo-v1
abhishek_autotrain-llama3-70b-orpo-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-llama3-70b-orpo-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-llama3-70b-orpo-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-llama3-70b-orpo-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-llama3-70b-orpo-v2
a2c16a8a7fa48792eb8a1f0c50e13309c2021a63
28.867313
other
3
70.554
true
false
false
true
25.054093
0.540606
54.060559
0.589947
39.882199
0.210725
21.072508
0.293624
5.816555
0.411333
9.95
0.481799
42.42206
false
false
2024-05-04
2024-08-21
0
abhishek/autotrain-llama3-70b-orpo-v2
abhishek_autotrain-llama3-orpo-v2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-llama3-orpo-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-llama3-orpo-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-llama3-orpo-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-llama3-orpo-v2
1655d0683696a5de2eb9a59c339ee469297beb9c
12.276281
other
3
8.03
true
false
false
true
1.81188
0.437166
43.716561
0.315938
4.380134
0.046828
4.682779
0.266779
2.237136
0.37924
5.104948
0.221825
13.536126
false
false
2024-04-22
2024-06-26
0
abhishek/autotrain-llama3-orpo-v2
abhishek_autotrain-vr4a1-e5mms_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/abhishek/autotrain-vr4a1-e5mms" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abhishek/autotrain-vr4a1-e5mms</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abhishek__autotrain-vr4a1-e5mms-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abhishek/autotrain-vr4a1-e5mms
5206a32e0bd3067aef1ce90f5528ade7d866253f
18.659968
other
0
16.061
true
false
false
false
3.745756
0.214225
21.422492
0.500062
28.456617
0.141239
14.123867
0.319631
9.284116
0.389125
9.040625
0.366689
29.632092
false
false
2024-09-05
2024-09-06
2
meta-llama/Meta-Llama-3.1-8B
abideen_MedPhi-4-14B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/abideen/MedPhi-4-14B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">abideen/MedPhi-4-14B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/abideen__MedPhi-4-14B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
abideen/MedPhi-4-14B-v1
2efbd4c10d39ac26cfe89535c5eb38b293b794bb
36.535941
0
14.66
false
false
false
true
1.823721
0.627683
62.768344
0.689678
55.57897
0.293051
29.305136
0.34396
12.527964
0.415458
10.832292
0.533826
48.20294
false
false
2025-01-13
0
Removed
adamo1139_Yi-34B-200K-AEZAKMI-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/adamo1139/Yi-34B-200K-AEZAKMI-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">adamo1139/Yi-34B-200K-AEZAKMI-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/adamo1139__Yi-34B-200K-AEZAKMI-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
adamo1139/Yi-34B-200K-AEZAKMI-v2
189b42b0dae6352fbe7165255aae851961c8e678
23.827349
apache-2.0
12
34.389
true
false
false
true
6.043149
0.455526
45.552578
0.538382
35.276425
0.056647
5.664653
0.332215
10.961969
0.388604
6.475521
0.451297
39.032949
false
false
2023-12-13
2024-06-26
0
adamo1139/Yi-34B-200K-AEZAKMI-v2
adriszmar_QAIMath-Qwen2.5-7B-TIES_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/adriszmar/QAIMath-Qwen2.5-7B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">adriszmar/QAIMath-Qwen2.5-7B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/adriszmar__QAIMath-Qwen2.5-7B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
adriszmar/QAIMath-Qwen2.5-7B-TIES
c89bc166dbe2a31c1fceb40ea7acdd96c5620ff5
5.469542
apache-2.0
0
7.616
true
false
false
false
1.283408
0.174632
17.46322
0.312638
5.253691
0
0
0.244966
0
0.409594
9.132552
0.10871
0.96779
true
false
2024-10-27
2024-10-27
0
adriszmar/QAIMath-Qwen2.5-7B-TIES
adriszmar_QAIMath-Qwen2.5-7B-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/adriszmar/QAIMath-Qwen2.5-7B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">adriszmar/QAIMath-Qwen2.5-7B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/adriszmar__QAIMath-Qwen2.5-7B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
adriszmar/QAIMath-Qwen2.5-7B-TIES
c89bc166dbe2a31c1fceb40ea7acdd96c5620ff5
4.988442
apache-2.0
0
7.616
true
false
false
false
2.612632
0.168537
16.853726
0.312427
5.019151
0.001511
0.151057
0.249161
0
0.396292
7.169792
0.106632
0.736924
true
false
2024-10-27
2024-10-27
0
adriszmar/QAIMath-Qwen2.5-7B-TIES
aevalone_distill_qw_test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/aevalone/distill_qw_test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aevalone/distill_qw_test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aevalone__distill_qw_test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
aevalone/distill_qw_test
d8d2b85fcf1ac1170536c12f98892552f184337b
33.684098
apache-2.0
0
7.616
true
false
false
true
0.638374
0.74089
74.088973
0.524575
33.097457
0.478097
47.809668
0.300336
6.711409
0.385969
6.046094
0.409159
34.35099
false
false
2025-03-01
2025-03-05
3
Qwen/Qwen2.5-7B
agentlans_Gemma2-9B-AdvancedFuse_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Gemma2-9B-AdvancedFuse" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Gemma2-9B-AdvancedFuse</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Gemma2-9B-AdvancedFuse-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Gemma2-9B-AdvancedFuse
f7d31619237579b7b473f8ebe87047cf881a33a6
20.43458
gemma
0
9.242
true
false
false
true
3.959093
0.154273
15.427288
0.585937
40.516738
0.100453
10.045317
0.334732
11.297539
0.423083
11.985417
0.400017
33.33518
false
false
2025-01-03
2025-01-21
1
agentlans/Gemma2-9B-AdvancedFuse (Merge)
agentlans_Llama-3.2-1B-Instruct-CrashCourse12K_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Llama-3.2-1B-Instruct-CrashCourse12K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Llama-3.2-1B-Instruct-CrashCourse12K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Llama-3.2-1B-Instruct-CrashCourse12K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Llama-3.2-1B-Instruct-CrashCourse12K
9b32ead43c60e8c3b16ee38f223236d3a44f4aa6
13.437968
llama3.2
0
1.236
true
false
false
true
0.776333
0.539506
53.950629
0.35481
9.387918
0.070997
7.099698
0.240772
0
0.321042
1.196875
0.180934
8.992686
false
false
2025-01-05
2025-01-05
0
agentlans/Llama-3.2-1B-Instruct-CrashCourse12K
agentlans_Llama3.1-8B-drill_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Llama3.1-8B-drill" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Llama3.1-8B-drill</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Llama3.1-8B-drill-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Llama3.1-8B-drill
0e8ce55d45a029e132ea4a09bdfb8736c7434384
26.724907
1
8.03
false
false
false
true
1.382947
0.76517
76.516975
0.501568
28.791683
0.17145
17.145015
0.267617
2.348993
0.36724
4.704948
0.377576
30.841829
false
false
2024-12-27
2024-12-27
1
agentlans/Llama3.1-8B-drill (Merge)
agentlans_Llama3.1-Daredevilish_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Llama3.1-Daredevilish" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Llama3.1-Daredevilish</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Llama3.1-Daredevilish-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Llama3.1-Daredevilish
43e0c3d9792d39b0e41bbe7fa21116751418b7d9
25.570843
llama3.1
1
8.03
true
false
false
true
1.401547
0.629157
62.91573
0.501251
29.20273
0.129154
12.915408
0.301174
6.823266
0.409094
11.603385
0.369681
29.964539
true
false
2025-01-22
2025-01-22
1
agentlans/Llama3.1-Daredevilish (Merge)
agentlans_Llama3.1-Daredevilish-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Llama3.1-Daredevilish-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Llama3.1-Daredevilish-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Llama3.1-Daredevilish-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Llama3.1-Daredevilish-Instruct
97acbab4c2412facfda8469521ce3e86baa28e23
29.365204
llama3.1
1
8.03
true
false
false
true
2.914256
0.792597
79.259698
0.523544
32.217512
0.172205
17.220544
0.307047
7.606264
0.391083
7.91875
0.387716
31.968454
true
false
2025-01-22
2025-01-22
1
agentlans/Llama3.1-Daredevilish-Instruct (Merge)
agentlans_Llama3.1-LexiHermes-SuperStorm_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Llama3.1-LexiHermes-SuperStorm" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Llama3.1-LexiHermes-SuperStorm</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Llama3.1-LexiHermes-SuperStorm-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Llama3.1-LexiHermes-SuperStorm
a7e3bdb9308da0a6aa3108b9e14db4e93000be83
29.430987
llama3.1
2
8.03
true
false
false
true
1.238542
0.783455
78.345457
0.526646
32.547493
0.161631
16.163142
0.322987
9.731544
0.39626
8.199219
0.384392
31.599069
true
false
2025-02-12
2025-02-19
1
agentlans/Llama3.1-LexiHermes-SuperStorm (Merge)
agentlans_Llama3.1-SuperDeepFuse_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Llama3.1-SuperDeepFuse" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Llama3.1-SuperDeepFuse</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Llama3.1-SuperDeepFuse-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Llama3.1-SuperDeepFuse
4fbcc81c2c9341f72f921bcc6d17523c62e8b099
27.387201
llama3.1
1
8.03
true
false
false
true
1.345605
0.776161
77.616059
0.504854
29.218387
0.182779
18.277946
0.274329
3.243848
0.369875
5.134375
0.377493
30.832595
true
false
2025-01-20
2025-01-21
1
agentlans/Llama3.1-SuperDeepFuse (Merge)
agentlans_Llama3.1-SuperDeepFuse-CrashCourse12K_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Llama3.1-SuperDeepFuse-CrashCourse12K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Llama3.1-SuperDeepFuse-CrashCourse12K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Llama3.1-SuperDeepFuse-CrashCourse12K-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Llama3.1-SuperDeepFuse-CrashCourse12K
0d38d69d9daeeb59dd837e21e8373a372e3c2226
27.995793
llama3.1
1
8.03
true
false
false
true
1.410392
0.718733
71.873296
0.521551
31.828444
0.180514
18.05136
0.312919
8.389262
0.402646
8.597396
0.363115
29.235003
false
false
2025-01-24
2025-01-24
1
agentlans/Llama3.1-SuperDeepFuse-CrashCourse12K (Merge)
agentlans_Qwen2.5-0.5B-Instruct-CrashCourse-dropout_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/agentlans/Qwen2.5-0.5B-Instruct-CrashCourse-dropout" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">agentlans/Qwen2.5-0.5B-Instruct-CrashCourse-dropout</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/agentlans__Qwen2.5-0.5B-Instruct-CrashCourse-dropout-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
agentlans/Qwen2.5-0.5B-Instruct-CrashCourse-dropout
09acebcae7826ca66d950641a92a2732d3cf49eb
8.433362
apache-2.0
0
0.494
true
false
false
true
0.918546
0.294883
29.488313
0.331173
7.227868
0.042296
4.229607
0.263423
1.789709
0.334188
1.106771
0.160821
6.757905
false
false
2025-01-01
2025-01-01
1
agentlans/Qwen2.5-0.5B-Instruct-CrashCourse-dropout (Merge)