Column                  Dtype    Range / classes
eval_name               string   length 12 to 111
Precision               string   3 values
Type                    string   7 values
T                       string   7 values
Weight type             string   2 values
Architecture            string   64 values
Model                   string   length 355 to 689
fullname                string   length 4 to 102
Model sha               string   length 0 to 40
Average ⬆️              float64  0.74 to 52.1
Hub License             string   27 values
Hub ❤️                  int64    0 to 6.09k
#Params (B)             float64  -1 to 141
Available on the hub    bool     2 classes
MoE                     bool     2 classes
Flagged                 bool     2 classes
Chat Template           bool     2 classes
CO₂ cost (kg)           float64  0.04 to 187
IFEval Raw              float64  0 to 0.9
IFEval                  float64  0 to 90
BBH Raw                 float64  0.22 to 0.83
BBH                     float64  0.25 to 76.7
MATH Lvl 5 Raw          float64  0 to 0.71
MATH Lvl 5              float64  0 to 71.5
GPQA Raw                float64  0.21 to 0.47
GPQA                    float64  0 to 29.4
MUSR Raw                float64  0.29 to 0.6
MUSR                    float64  0 to 38.7
MMLU-PRO Raw            float64  0.1 to 0.73
MMLU-PRO                float64  0 to 70
Merged                  bool     2 classes
Official Providers      bool     2 classes
Upload To Hub Date      string   525 values
Submission Date         string   263 values
Generation              int64    0 to 10
Base Model              string   length 4 to 102
HeraiHench_DeepSeek-R1-Qwen-Coder-8B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/HeraiHench/DeepSeek-R1-Qwen-Coder-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HeraiHench/DeepSeek-R1-Qwen-Coder-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HeraiHench__DeepSeek-R1-Qwen-Coder-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HeraiHench/DeepSeek-R1-Qwen-Coder-8B
b4e6fe291d5c1d61805f31ae42742d81cf7cd594
4.563388
0
8.164
false
false
false
false
0.642793
0.186947
18.69473
0.291344
2.147967
0
0
0.260067
1.342282
0.373844
3.830469
0.112284
1.364879
false
false
2025-01-29
0
Removed
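In the records here, the Average ⬆️ value is consistent with the arithmetic mean of the six normalized benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal check, assuming that definition, against the HeraiHench/DeepSeek-R1-Qwen-Coder-8B record above:

```python
# Normalized (non-Raw) benchmark scores copied from the record above.
scores = {
    "IFEval": 18.69473,
    "BBH": 2.147967,
    "MATH Lvl 5": 0.0,
    "GPQA": 1.342282,
    "MUSR": 3.830469,
    "MMLU-PRO": 1.364879,
}

# Average ⬆️ appears to be the plain mean of the six normalized scores.
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 4.563388, matching the record's Average ⬆️ field
```

The same check holds for the Double-Down-Qwen-Math-7B record that follows (mean 4.309786), so the relation is not specific to one row.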
HeraiHench_Double-Down-Qwen-Math-7B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/HeraiHench/Double-Down-Qwen-Math-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HeraiHench/Double-Down-Qwen-Math-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HeraiHench__Double-Down-Qwen-Math-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HeraiHench/Double-Down-Qwen-Math-7B
339e6488b9bc9cb2874e91396a984ee96865a34a
4.309786
0
7.616
false
false
false
false
1.399767
0.166964
16.696366
0.284461
1.95487
0.000755
0.075529
0.265101
2.013423
0.373656
3.873698
0.111203
1.244829
false
false
2025-01-30
0
Removed
HeraiHench_Marge-Qwen-Math-7B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/HeraiHench/Marge-Qwen-Math-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HeraiHench/Marge-Qwen-Math-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HeraiHench__Marge-Qwen-Math-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HeraiHench/Marge-Qwen-Math-7B
f6c2071d34616d61474b9727f1151de56f566e93
4.083812
0
7.616
false
false
false
false
1.373742
0.126222
12.622176
0.306885
3.363509
0.005287
0.528701
0.239094
0
0.393906
7.371615
0.105552
0.616874
false
false
2025-01-30
0
Removed
HeraiHench_Phi-4-slerp-ReasoningRP-14B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HeraiHench/Phi-4-slerp-ReasoningRP-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HeraiHench/Phi-4-slerp-ReasoningRP-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HeraiHench__Phi-4-slerp-ReasoningRP-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HeraiHench/Phi-4-slerp-ReasoningRP-14B
2763ec1a660dad11fab64eed0507e5d24e71806d
8.512387
0
9.207
false
false
false
false
1.311199
0.157546
15.754642
0.419572
18.885376
0
0
0.293624
5.816555
0.311615
0.61849
0.189993
9.999261
false
false
2025-01-29
0
Removed
HiroseKoichi_Llama-Salad-4x8B-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/HiroseKoichi/Llama-Salad-4x8B-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HiroseKoichi/Llama-Salad-4x8B-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HiroseKoichi__Llama-Salad-4x8B-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HiroseKoichi/Llama-Salad-4x8B-V3
a343915429779efbd1478f01ba1f7fd9d8d226c0
24.922702
llama3
6
24.942
true
true
false
true
4.27539
0.665352
66.535238
0.524465
31.928849
0.095921
9.592145
0.302852
7.04698
0.374031
6.453906
0.351812
27.979093
true
false
2024-06-17
2024-06-26
0
HiroseKoichi/Llama-Salad-4x8B-V3
HoangHa_Pensez-Llama3.1-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HoangHa/Pensez-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HoangHa/Pensez-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HoangHa__Pensez-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HoangHa/Pensez-Llama3.1-8B
e7eab5cd6096c33f1f08a36a05d0ac83c7d950b1
19.048046
llama3.1
2
8.03
true
false
false
true
2.232677
0.388681
38.868092
0.466913
24.845158
0.114804
11.480363
0.288591
5.145414
0.359698
10.328906
0.312583
23.620346
false
false
2025-02-19
2025-02-19
0
HoangHa/Pensez-Llama3.1-8B
HuggingFaceH4_zephyr-7b-alpha_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-alpha
2ce2d025864af849b3e5029e2ec9d568eeda892d
18.598795
mit
1,108
7.242
true
false
false
true
0.903054
0.519148
51.914808
0.458286
23.890291
0.019637
1.963746
0.297819
6.375839
0.394958
7.503125
0.279505
19.944962
false
true
2023-10-09
2024-06-12
1
mistralai/Mistral-7B-v0.1
HuggingFaceH4_zephyr-7b-beta_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-beta
b70e0c9a2d9e14bd1e812d3c398e5f313e93b473
17.792237
mit
1,667
7.242
true
false
false
true
1.110046
0.495043
49.504315
0.431582
21.487542
0.028701
2.870091
0.290268
5.369128
0.392542
7.734375
0.278092
19.787973
false
true
2023-10-26
2024-06-12
1
mistralai/Mistral-7B-v0.1
HuggingFaceH4_zephyr-7b-gemma-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-7b-gemma-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-7b-gemma-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-7b-gemma-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-7b-gemma-v0.1
03b3427d0ed07d2e0f86c0a7e53d82d4beef9540
16.030043
other
124
8.538
true
false
false
true
2.860948
0.336374
33.637415
0.462374
23.751163
0.081571
8.1571
0.294463
5.928412
0.373969
4.179427
0.284741
20.526743
false
true
2024-03-01
2024-06-12
2
google/gemma-7b
HuggingFaceH4_zephyr-orpo-141b-A35b-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceH4__zephyr-orpo-141b-A35b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1
a3be084543d278e61b64cd600f28157afc79ffd3
34.125963
apache-2.0
266
140.621
true
false
false
true
84.135573
0.651089
65.108911
0.629044
47.503796
0.204683
20.468278
0.378356
17.114094
0.446521
14.715104
0.45861
39.845597
false
true
2024-04-10
2024-06-12
1
mistral-community/Mixtral-8x22B-v0.1
HuggingFaceTB_SmolLM-1.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-1.7B
673a07602ca1191e5bc2ddda428e2f608a0a14c0
5.576456
apache-2.0
170
1.71
true
false
false
false
0.648615
0.236157
23.615673
0.318052
4.411128
0.016616
1.661631
0.241611
0
0.342094
2.128385
0.114777
1.641918
false
true
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-1.7B
HuggingFaceTB_SmolLM-1.7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-1.7B-Instruct
0ad161e59935a9a691dfde2818df8b98786f30a7
5.490689
apache-2.0
110
1.71
true
false
false
true
0.634045
0.234783
23.47826
0.288511
2.080374
0.021148
2.114804
0.260067
1.342282
0.348667
2.083333
0.116606
1.84508
false
true
2024-07-15
2024-07-18
1
HuggingFaceTB/SmolLM-1.7B
HuggingFaceTB_SmolLM-135M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-135M
eec6e461571fba3e197a57c298f60b75422eae02
6.95149
apache-2.0
197
0.13
true
false
false
false
0.686755
0.212476
21.247623
0.304605
3.2854
0.013595
1.359517
0.258389
1.118568
0.436604
13.342188
0.112201
1.355644
false
true
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-135M
HuggingFaceTB_SmolLM-135M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-135M-Instruct
8ca7af58e27777cae460ad8ca3ab9db15f5c160d
3.652288
apache-2.0
109
0.135
true
false
false
true
0.59726
0.121401
12.140122
0.301508
2.692958
0.005287
0.528701
0.259228
1.230425
0.363458
3.365625
0.117603
1.955895
false
true
2024-07-15
2024-10-12
1
HuggingFaceTB/SmolLM-135M
HuggingFaceTB_SmolLM-360M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-360M
318cc630b73730bfd712e5873063156ffb8936b5
6.260889
apache-2.0
62
0.36
true
false
false
false
0.730519
0.213351
21.335058
0.306452
3.284915
0.011329
1.132931
0.267617
2.348993
0.401781
8.089323
0.112367
1.374113
false
true
2024-07-14
2024-07-18
0
HuggingFaceTB/SmolLM-360M
HuggingFaceTB_SmolLM-360M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM-360M-Instruct
8e951de8c220295ea4f85d078c4e320df7137535
5.008899
apache-2.0
80
0.362
true
false
false
true
0.733002
0.195165
19.516549
0.288511
2.080374
0.018127
1.812689
0.264262
1.901566
0.347177
2.897135
0.116606
1.84508
false
true
2024-07-15
2024-08-20
1
HuggingFaceTB/SmolLM-360M
HuggingFaceTB_SmolLM2-1.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-1.7B
4fa12cab4f5f53670b05125fb9d2873af587d231
9.583621
apache-2.0
113
1.71
true
false
false
false
0.650052
0.244
24.400036
0.345259
9.301788
0.026435
2.643505
0.279362
3.914989
0.348542
4.601042
0.213763
12.640366
false
true
2024-10-30
2024-11-06
0
HuggingFaceTB/SmolLM2-1.7B
HuggingFaceTB_SmolLM2-1.7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-1.7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-1.7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-1.7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-1.7B-Instruct
d1bb90bcfbe0f211109880f4da18da66f229c4f6
15.022278
apache-2.0
580
1.711
true
false
false
true
0.939721
0.536784
53.678351
0.359862
10.917989
0.058157
5.81571
0.279362
3.914989
0.342125
4.098958
0.205369
11.707668
false
true
2024-10-31
2024-11-06
1
HuggingFaceTB/SmolLM2-1.7B-Instruct (Merge)
HuggingFaceTB_SmolLM2-135M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M
28e66ca6931668447a3bac213f23d990ad3b0e2b
5.695927
apache-2.0
70
0.135
true
false
false
false
0.677924
0.181777
18.177658
0.304423
3.708078
0.012085
1.208459
0.248322
0
0.411177
10.030469
0.109458
1.050901
false
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-135M
HuggingFaceTB_SmolLM2-135M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M-Instruct
5a33ba103645800d7b3790c4448546c1b73efc71
6.467365
apache-2.0
158
0.135
true
false
false
true
0.338376
0.288314
28.83139
0.312432
4.720808
0.003021
0.302115
0.235738
0
0.366219
3.677344
0.111453
1.272533
false
true
2024-10-31
2024-11-06
1
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
HuggingFaceTB_SmolLM2-135M-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-135M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-135M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-135M-Instruct
5a33ba103645800d7b3790c4448546c1b73efc71
3.206597
apache-2.0
158
0.135
true
false
false
false
0.697508
0.059252
5.925167
0.313475
4.796276
0.01435
1.435045
0.23406
0
0.387146
6.059896
0.109209
1.023197
false
true
2024-10-31
2024-11-14
1
HuggingFaceTB/SmolLM2-135M-Instruct (Merge)
HuggingFaceTB_SmolLM2-360M_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-360M
3ce05f63c246c44616da500b47b01f082f4d3bcc
6.251282
apache-2.0
40
0.36
true
false
false
false
0.773316
0.211452
21.145228
0.323348
5.543603
0.012085
1.208459
0.245805
0
0.395427
7.728385
0.116938
1.882018
false
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-360M
HuggingFaceTB_SmolLM2-360M-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-360M-Instruct
4873f67095301d304753fae05bc09ec766634e50
3.10002
apache-2.0
102
0.362
true
false
false
false
0.392382
0.083032
8.303191
0.30527
3.299047
0.008308
0.830816
0.265101
2.013423
0.342281
2.751823
0.112616
1.401817
false
true
2024-10-31
2024-11-14
0
HuggingFaceTB/SmolLM2-360M-Instruct
HuggingFaceTB_SmolLM2-360M-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HuggingFaceTB/SmolLM2-360M-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HuggingFaceTB/SmolLM2-360M-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HuggingFaceTB__SmolLM2-360M-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HuggingFaceTB/SmolLM2-360M-Instruct
4873f67095301d304753fae05bc09ec766634e50
8.139566
apache-2.0
102
0.36
true
false
false
true
0.751639
0.38416
38.415959
0.314351
4.173864
0.015106
1.510574
0.255034
0.671141
0.346125
2.765625
0.111702
1.300236
false
true
2024-10-31
2024-11-06
0
HuggingFaceTB/SmolLM2-360M-Instruct
HumanLLMs_Humanish-LLama3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-LLama3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-LLama3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-LLama3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HumanLLMs/Humanish-LLama3-8B-Instruct
42f73ada2b7fb16f18a75404d72b7911bf1e65ce
22.678204
llama3
18
8.03
true
false
false
true
1.496556
0.64979
64.979033
0.496771
28.012477
0.102719
10.271903
0.255872
0.782998
0.358156
2.002865
0.37018
30.019947
false
false
2024-10-04
2024-10-05
1
meta-llama/Meta-Llama-3-8B-Instruct
HumanLLMs_Humanish-Mistral-Nemo-Instruct-2407_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Mistral-Nemo-Instruct-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HumanLLMs/Humanish-Mistral-Nemo-Instruct-2407
45b80bdce8d447ef494af06751904afcc607eb37
23.888068
apache-2.0
13
12.248
true
false
false
true
3.240567
0.545127
54.512693
0.526178
32.709613
0.136707
13.670695
0.287752
5.033557
0.39676
9.395052
0.352061
28.006797
false
false
2024-10-06
2024-10-06
2
mistralai/Mistral-Nemo-Base-2407
HumanLLMs_Humanish-Qwen2.5-7B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/HumanLLMs/Humanish-Qwen2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">HumanLLMs/Humanish-Qwen2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/HumanLLMs__Humanish-Qwen2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
HumanLLMs/Humanish-Qwen2.5-7B-Instruct
7d2c71d926832d6e257ad2776011494dbac2d151
34.998707
apache-2.0
11
7.616
true
false
false
true
2.386785
0.728425
72.842502
0.536368
34.478998
0.5
50
0.298658
6.487696
0.398063
8.424479
0.439827
37.75857
false
false
2024-10-05
2024-10-05
2
Qwen/Qwen2.5-7B
IDEA-CCNL_Ziya-LLaMA-13B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/IDEA-CCNL/Ziya-LLaMA-13B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IDEA-CCNL/Ziya-LLaMA-13B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IDEA-CCNL__Ziya-LLaMA-13B-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
IDEA-CCNL/Ziya-LLaMA-13B-v1
64d931f346e1a49ea3bbca07a83137075bab1c66
3.906425
gpl-3.0
274
13
true
false
false
false
2.216515
0.169686
16.968643
0.287703
1.463617
0
0
0.249161
0
0.375052
3.88151
0.110123
1.124778
false
true
2023-05-16
2024-06-12
0
IDEA-CCNL/Ziya-LLaMA-13B-v1
INSAIT-Institute_BgGPT-Gemma-2-27B-IT-v1.0_float16
float16
🟩 continuously pretrained
🟩
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/INSAIT-Institute__BgGPT-Gemma-2-27B-IT-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0
2ce5574f5d0daf61b39cffd80023dd73782b87e3
1.678063
gemma
15
27.227
true
false
false
true
20.726244
0
0
0.291178
2.347041
0
0
0.260067
1.342282
0.357531
4.52474
0.116689
1.854314
false
false
2024-11-15
2024-12-15
1
INSAIT-Institute/BgGPT-Gemma-2-27B-IT-v1.0 (Merge)
IlyaGusev_gemma-2-2b-it-abliterated_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/IlyaGusev/gemma-2-2b-it-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IlyaGusev/gemma-2-2b-it-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IlyaGusev__gemma-2-2b-it-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
IlyaGusev/gemma-2-2b-it-abliterated
16.705746
gemma
49
2.614
true
false
false
true
3.687173
0.533087
53.308665
0.41186
16.796335
0.061178
6.117825
0.265101
2.013423
0.378187
4.906771
0.253823
17.09146
false
false
2024-07-31
2025-01-07
0
IlyaGusev/gemma-2-2b-it-abliterated
IlyaGusev_gemma-2-9b-it-abliterated_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/IlyaGusev/gemma-2-9b-it-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IlyaGusev/gemma-2-9b-it-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IlyaGusev__gemma-2-9b-it-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
IlyaGusev/gemma-2-9b-it-abliterated
e2b6426b20a3a889f0c182056b0dbbb7fa585d25
31.29423
gemma
37
9.242
true
false
false
true
2.622688
0.747259
74.725949
0.590633
40.824685
0.177492
17.749245
0.345638
12.751678
0.403365
9.320573
0.391539
32.393248
false
false
2024-07-13
2025-01-07
0
IlyaGusev/gemma-2-9b-it-abliterated
Infinirc_Infinirc-Llama3-8B-2G-Release-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Infinirc__Infinirc-Llama3-8B-2G-Release-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0
9c542d9ec3f86e145ae445c200c6ebe9066e8cd6
13.162662
llama3
1
8.03
true
false
false
false
2.716201
0.202434
20.243399
0.435074
20.831165
0.016616
1.661631
0.299497
6.599553
0.460938
16.750521
0.216007
12.889702
false
false
2024-06-26
2024-09-29
0
Infinirc/Infinirc-Llama3-8B-2G-Release-v1.0
Intel_neural-chat-7b-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3
fc679274dfcd28a8b6087634f71af7ed2a0659c4
18.069527
apache-2.0
67
7
true
false
false
false
0.978581
0.277797
27.779736
0.504832
30.205692
0.029456
2.945619
0.291946
5.592841
0.50549
23.019531
0.269864
18.873744
false
true
2023-10-25
2024-06-12
1
mistralai/Mistral-7B-v0.1
Intel_neural-chat-7b-v3-1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-1
c0d379a49c1c0579529d5e6f2e936ddb759552a8
21.067927
apache-2.0
546
7.242
true
false
false
false
1.127384
0.46869
46.868974
0.505157
29.739752
0.035498
3.549849
0.290268
5.369128
0.497896
22.236979
0.267786
18.642878
false
true
2023-11-14
2024-06-12
1
mistralai/Mistral-7B-v0.1
Intel_neural-chat-7b-v3-2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-2
0d8f77647810d21d935ea90c66d6339b85e65a75
21.471411
apache-2.0
57
7
true
false
false
false
1.120883
0.49884
49.883975
0.503223
30.237458
0.047583
4.758308
0.290268
5.369128
0.489521
20.056771
0.266705
18.522828
false
true
2023-11-21
2024-06-12
0
Intel/neural-chat-7b-v3-2
Intel_neural-chat-7b-v3-3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Intel/neural-chat-7b-v3-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Intel/neural-chat-7b-v3-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Intel__neural-chat-7b-v3-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Intel/neural-chat-7b-v3-3
bdd31cf498d13782cc7497cba5896996ce429f91
20.557586
apache-2.0
79
7
true
false
false
false
1.119048
0.476259
47.625855
0.487662
27.753851
0.040785
4.07855
0.28943
5.257271
0.485958
20.578125
0.262467
18.051862
false
true
2023-12-09
2024-06-12
2
mistralai/Mistral-7B-v0.1
IntervitensInc_internlm2_5-20b-llamafied_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/IntervitensInc/internlm2_5-20b-llamafied" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">IntervitensInc/internlm2_5-20b-llamafied</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/IntervitensInc__internlm2_5-20b-llamafied-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
IntervitensInc/internlm2_5-20b-llamafied
0b6fc3cc0b9bf3529816061eb508483c20b77fe9
29.216881
apache-2.0
4
19.861
true
false
false
false
2.762256
0.340995
34.099523
0.747847
63.47058
0.17145
17.145015
0.338087
11.744966
0.447542
14.942708
0.405086
33.898493
false
false
2024-08-06
2024-11-11
0
IntervitensInc/internlm2_5-20b-llamafied
Invalid-Null_PeiYangMe-0.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Invalid-Null/PeiYangMe-0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Invalid-Null/PeiYangMe-0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Invalid-Null__PeiYangMe-0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Invalid-Null/PeiYangMe-0.5
34e2c17d7bc7b34bd744ce5466046c04db2cf367
3.427367
apache-2.0
0
6.061
true
false
false
false
1.515357
0.140885
14.088507
0.279077
1.474577
0
0
0.244128
0
0.373812
3.793229
0.110871
1.20789
false
false
2024-12-28
2024-12-28
1
Invalid-Null/PeiYangMe-0.5 (Merge)
Invalid-Null_PeiYangMe-0.7_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Invalid-Null/PeiYangMe-0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Invalid-Null/PeiYangMe-0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Invalid-Null__PeiYangMe-0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Invalid-Null/PeiYangMe-0.7
b30d72771b170eea4eeab447ce2d62696a292e02
4.39728
apache-2.0
0
6.061
true
false
false
false
1.493281
0.149103
14.910327
0.302753
3.600798
0.011329
1.132931
0.233221
0
0.385719
5.614844
0.110123
1.124778
false
false
2024-12-28
2024-12-28
1
Invalid-Null/PeiYangMe-0.7 (Merge)
Isaak-Carter_JOSIEv4o-8b-stage1-v4_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/JOSIEv4o-8b-stage1-v4
a8380a7be51b547761824e524b3d95ac73203122
15.668082
apache-2.0
1
8.03
true
false
false
false
1.781164
0.255266
25.526603
0.472497
25.787276
0.05287
5.287009
0.291946
5.592841
0.365438
6.079687
0.331616
25.735077
false
false
2024-08-03
2024-08-03
0
Isaak-Carter/JOSIEv4o-8b-stage1-v4
Isaak-Carter_JOSIEv4o-8b-stage1-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/JOSIEv4o-8b-stage1-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/JOSIEv4o-8b-stage1-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__JOSIEv4o-8b-stage1-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/JOSIEv4o-8b-stage1-v4
a8380a7be51b547761824e524b3d95ac73203122
15.419272
apache-2.0
1
8.03
true
false
false
false
0.879882
0.247697
24.769722
0.475807
25.919578
0.045317
4.531722
0.291107
5.480984
0.364104
6.346354
0.329205
25.467272
false
false
2024-08-03
2024-08-03
0
Isaak-Carter/JOSIEv4o-8b-stage1-v4
Isaak-Carter_Josiefied-Qwen2.5-7B-Instruct-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__Josiefied-Qwen2.5-7B-Instruct-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated
879168f9ce9fac315a19dd4f4c7df5253bb660f2
35.064748
0
7.616
false
false
false
true
2.153581
0.731747
73.174732
0.539638
34.904316
0.492447
49.244713
0.302852
7.04698
0.408667
9.616667
0.42761
36.401079
false
false
2024-09-21
0
Removed
Isaak-Carter_Josiefied-Qwen2.5-7B-Instruct-abliterated-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Isaak-Carter__Josiefied-Qwen2.5-7B-Instruct-abliterated-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Isaak-Carter/Josiefied-Qwen2.5-7B-Instruct-abliterated-v2
5d07f58562422feb9f25c9c048e40356d2cf7e4b
35.685533
apache-2.0
6
7.616
true
false
false
true
2.261829
0.784104
78.410396
0.531092
33.29454
0.472054
47.205438
0.298658
6.487696
0.435396
13.957813
0.412816
34.757314
false
false
2024-09-20
2024-09-21
1
Qwen/Qwen2.5-7B
J-LAB_Thynk_orpo_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/J-LAB/Thynk_orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">J-LAB/Thynk_orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/J-LAB__Thynk_orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
J-LAB/Thynk_orpo
c6606d402f26d005b9f1a71a1cde9139d1cffb2a
17.263934
0
3.086
false
false
false
false
2.429528
0.210178
21.017788
0.446311
22.062784
0.148036
14.803625
0.292785
5.704698
0.451479
15.201563
0.323138
24.793144
false
false
2024-10-14
0
Removed
JackFram_llama-160m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/JackFram/llama-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JackFram/llama-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JackFram__llama-160m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JackFram/llama-160m
aca9b687d1425f863dcf5de9a4c96e3fe36266dd
4.73813
apache-2.0
34
0.162
true
false
false
false
0.186949
0.179104
17.910367
0.288802
2.033606
0.008308
0.830816
0.261745
1.565996
0.379208
4.667708
0.112783
1.420287
false
false
2023-05-26
2024-11-30
0
JackFram/llama-160m
JackFram_llama-68m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/JackFram/llama-68m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JackFram/llama-68m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JackFram__llama-68m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JackFram/llama-68m
964a5d77df908b69f8d6476fb70e940425b04cb5
4.96334
apache-2.0
26
0.068
true
false
false
false
0.121116
0.172634
17.263417
0.29363
2.591048
0.006042
0.60423
0.258389
1.118568
0.39099
6.607031
0.114362
1.595745
false
false
2023-07-19
2024-11-30
0
JackFram/llama-68m
Jacoby746_Casual-Magnum-34B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Casual-Magnum-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Casual-Magnum-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Casual-Magnum-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Casual-Magnum-34B
b628c6959441db75460cfd49536322b1ea46130e
23.797921
apache-2.0
1
34.389
true
false
false
false
6.853394
0.193017
19.301675
0.603205
43.051568
0.092145
9.214502
0.372483
16.331096
0.40776
8.403385
0.518368
46.485298
true
false
2024-10-01
2024-10-23
1
Jacoby746/Casual-Magnum-34B (Merge)
Jacoby746_Inf-Silent-Kunoichi-v0.1-2x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Inf-Silent-Kunoichi-v0.1-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B
9ab68beb6fe16cab2ab708b9af4417c89751d297
20.186181
apache-2.0
0
12.879
true
false
false
false
2.780861
0.387982
38.798167
0.518546
32.387004
0.070997
7.099698
0.28943
5.257271
0.428042
12.338542
0.327128
25.236407
false
false
2024-09-19
2024-09-20
1
Jacoby746/Inf-Silent-Kunoichi-v0.1-2x7B (Merge)
Jacoby746_Inf-Silent-Kunoichi-v0.2-2x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Inf-Silent-Kunoichi-v0.2-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B
711263c24f812676eb382a31a5f0fed9bd8c16e4
20.018228
apache-2.0
0
12.879
true
false
false
false
1.732529
0.363602
36.360191
0.520942
32.259184
0.062689
6.268882
0.300336
6.711409
0.431979
13.264062
0.327211
25.245641
false
false
2024-09-19
2024-09-21
1
Jacoby746/Inf-Silent-Kunoichi-v0.2-2x7B (Merge)
Jacoby746_Proto-Athena-4x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Athena-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Athena-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Athena-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Athena-4x7B
450fcba7a630fb61a662f71936d37979226fced8
19.775578
apache-2.0
0
24.154
true
false
false
false
3.353229
0.370296
37.029637
0.510655
30.870823
0.064955
6.495468
0.294463
5.928412
0.434771
13.813021
0.320645
24.516105
false
false
2024-09-21
2024-09-21
1
Jacoby746/Proto-Athena-4x7B (Merge)
Jacoby746_Proto-Athena-v0.2-4x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Athena-v0.2-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Athena-v0.2-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Athena-v0.2-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Athena-v0.2-4x7B
01feeded217ea83a8794e7968c8850859b5f0b14
19.345307
apache-2.0
0
24.154
true
false
false
false
3.302744
0.375242
37.524214
0.506773
30.340844
0.063444
6.344411
0.298658
6.487696
0.421281
10.960156
0.319731
24.414524
false
false
2024-09-21
2024-09-21
1
Jacoby746/Proto-Athena-v0.2-4x7B (Merge)
Jacoby746_Proto-Harpy-Blazing-Light-v0.1-2x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Harpy-Blazing-Light-v0.1-2x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B
bbb5d7c7a0c9e999e057ffa71eaa93d59d95b36b
22.481214
0
12.879
false
false
false
false
1.763682
0.490472
49.047195
0.518685
32.63253
0.074773
7.477341
0.295302
6.040268
0.444969
14.121094
0.33012
25.568853
false
false
2024-09-22
2024-09-30
1
Jacoby746/Proto-Harpy-Blazing-Light-v0.1-2x7B (Merge)
Jacoby746_Proto-Harpy-Spark-v0.1-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Jacoby746/Proto-Harpy-Spark-v0.1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Jacoby746/Proto-Harpy-Spark-v0.1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Jacoby746__Proto-Harpy-Spark-v0.1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Jacoby746/Proto-Harpy-Spark-v0.1-7B
984cca02cd930b2e1b7b2a7d53471d32d9821cdd
19.85
apache-2.0
0
7.242
true
false
false
false
1.191609
0.433269
43.326928
0.473577
26.91311
0.061934
6.193353
0.305369
7.38255
0.431667
12.291667
0.306932
22.992391
true
false
2024-09-22
2024-09-30
1
Jacoby746/Proto-Harpy-Spark-v0.1-7B (Merge)
JayHyeon_Qwen-0.5B-DPO-1epoch_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen-0.5B-DPO-1epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen-0.5B-DPO-1epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen-0.5B-DPO-1epoch-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen-0.5B-DPO-1epoch
f5569969d307d193798eff52c0527e23f4ac8bb9
7.385733
mit
0
0.494
true
false
false
true
0.956513
0.264733
26.473313
0.319075
5.543695
0.028701
2.870091
0.252517
0.33557
0.335177
2.897135
0.155751
6.194592
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen-0.5B-DPO-1epoch
JayHyeon_Qwen-0.5B-DPO-5epoch_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen-0.5B-DPO-5epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen-0.5B-DPO-5epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen-0.5B-DPO-5epoch-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen-0.5B-DPO-5epoch
4363737d67e793b7cfb714dda4aa27677a4db6e4
7.198583
mit
1
0.494
true
false
false
true
1.410732
0.257015
25.701472
0.311211
5.056692
0.04003
4.003021
0.243289
0
0.337969
2.51276
0.153258
5.917553
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen-0.5B-DPO-5epoch
JayHyeon_Qwen-0.5B-IRPO-1epoch_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen-0.5B-IRPO-1epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen-0.5B-IRPO-1epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen-0.5B-IRPO-1epoch-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen-0.5B-IRPO-1epoch
2dc73651ff3cbf0e4638c3bd5b1d87cfe2afc15f
7.031893
mit
0
0.494
true
false
false
true
0.990887
0.258913
25.891302
0.316382
5.324352
0.031722
3.172205
0.246644
0
0.328635
2.246094
0.150017
5.557402
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen-0.5B-IRPO-1epoch
JayHyeon_Qwen-0.5B-IRPO-5epoch_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen-0.5B-IRPO-5epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen-0.5B-IRPO-5epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen-0.5B-IRPO-5epoch-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen-0.5B-IRPO-5epoch
dca128b2490982a6f2d53d017ad44c1b7829fabe
6.923469
mit
0
0.494
true
false
false
true
1.463051
0.248671
24.86713
0.318917
5.711334
0.032477
3.247734
0.239933
0
0.328667
2.083333
0.150682
5.63128
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen-0.5B-IRPO-5epoch
JayHyeon_Qwen-0.5B-eDPO-1epoch_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen-0.5B-eDPO-1epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen-0.5B-eDPO-1epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen-0.5B-eDPO-1epoch-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen-0.5B-eDPO-1epoch
d24f341c6034334f397c156593ac8eece0a8a6ff
7.280997
mit
0
0.494
true
false
false
true
0.973138
0.262335
26.233505
0.318064
5.918401
0.034743
3.47432
0.24245
0
0.332698
1.920573
0.155253
6.139184
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen-0.5B-eDPO-1epoch
JayHyeon_Qwen-0.5B-eDPO-5epoch_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen-0.5B-eDPO-5epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen-0.5B-eDPO-5epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen-0.5B-eDPO-5epoch-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen-0.5B-eDPO-5epoch
8de61ddfbe7dc2a00228309af4851797694cd153
6.727795
mit
0
0.494
true
false
false
true
1.446004
0.247747
24.774709
0.309649
5.197838
0.023414
2.34139
0.249161
0
0.332635
2.246094
0.152261
5.806738
false
false
2024-12-20
2024-12-26
0
JayHyeon/Qwen-0.5B-eDPO-5epoch
JayHyeon_Qwen2.5-0.5B-Instruct-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-Instruct-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-Instruct-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-Instruct-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-Instruct-SFT
2e7122e69e62e72eba6e21b0bc921906402dd5fa
8.158039
mit
1
0.63
true
false
false
true
1.009092
0.276773
27.677341
0.32537
5.93242
0.039275
3.927492
0.282718
4.362416
0.334156
1.269531
0.152011
5.779034
false
false
2024-12-26
2024-12-26
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1
2736ff4f329f204d91dd47b8bc951945b7ccc572
8.150785
mit
0
0.494
true
false
false
true
1.004167
0.246873
24.687274
0.326031
6.126073
0.064955
6.495468
0.272651
3.020134
0.343365
2.18724
0.157497
6.38852
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-DPO-1epoch_v1
JayHyeon_Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1
c469240bdc78d707215b4e58d12a72c7b75abfb3
8.319017
mit
0
0.494
true
false
false
true
1.001503
0.260586
26.058636
0.330803
6.622365
0.049849
4.984894
0.280201
4.026846
0.328823
1.269531
0.162566
6.951832
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-IRPO-1epoch_v1
JayHyeon_Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1
77f99c1b2a2d32c84d0cd986eb952927c3b77497
7.89622
mit
0
0.494
true
false
false
true
0.995978
0.252918
25.291781
0.326195
6.129987
0.056647
5.664653
0.268456
2.46085
0.330125
1.432292
0.15758
6.397754
false
false
2024-12-26
2024-12-26
0
JayHyeon/Qwen2.5-0.5B-Instruct-SFT-MDPO-1epoch_v1
JayHyeon_Qwen2.5-0.5B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT
804606871a044917289c9ea22d335a80a0708cb6
6.611059
mit
0
0.63
true
false
false
false
1.727674
0.196365
19.636453
0.312075
4.434475
0.02719
2.719033
0.278523
3.803132
0.339427
1.595052
0.167304
7.478206
false
false
2024-12-27
2024-12-27
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-4
2303839a4f4f3a1ada55113c451177ad481eb647
5.941931
mit
0
0.63
true
false
false
false
1.714999
0.20196
20.195969
0.301709
4.331493
0.018882
1.888218
0.250839
0.111857
0.344635
2.246094
0.161902
6.877955
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-4-2ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-4-2ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-4-2ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-4-2ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-4-2ep
b183a552d7e26053a6fb0ee01191835d7735b80d
6.345325
0
0.63
false
false
false
false
1.580355
0.21405
21.404983
0.317223
5.650884
0.026435
2.643505
0.246644
0
0.347271
2.408854
0.153674
5.963726
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-4-3ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-4-3ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-4-3ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-4-3ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-4-3ep
c8eb2639aeba964754425b9e22fda656fdfb9f06
6.469033
mit
0
0.63
true
false
false
false
1.420019
0.22574
22.573993
0.306426
4.797235
0.026435
2.643505
0.248322
0
0.366062
2.891146
0.153175
5.908319
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-4-5ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-4-5ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-4-5ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-4-5ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-4-5ep
3baf97b78f186e2a2bfadb77a894cc12642709a8
5.890344
0
0.63
false
false
false
false
1.26795
0.198687
19.868726
0.310447
4.784241
0.019637
1.963746
0.253356
0.447427
0.340667
2.083333
0.155751
6.194592
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-5
7a00c50afd13d3020b200733e87309ea81126501
6.664257
mit
0
0.63
true
false
false
false
1.716113
0.198588
19.858753
0.313986
4.213684
0.037764
3.776435
0.268456
2.46085
0.346031
1.920573
0.169797
7.755245
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-5-2ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-5-2ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-5-2ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-5-2ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-5-2ep
27b7325a3e7b629ded08040ba017a6c63e3be68a
7.003092
0
0.63
false
false
false
false
1.694762
0.197064
19.706379
0.32247
5.619297
0.05287
5.287009
0.269295
2.572707
0.33676
1.595052
0.165143
7.238106
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-5-3ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-5-3ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-5-3ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-5-3ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-5-3ep
869cde816bd94004f54e73447291eb4eb9da832b
7.74034
mit
0
0.63
true
false
false
false
1.651995
0.224116
22.411646
0.324681
6.246296
0.053625
5.362538
0.270134
2.684564
0.335333
2.083333
0.168883
7.653664
false
false
2024-12-28
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-1e-5-5ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-1e-5-5ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-1e-5-5ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-1e-5-5ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-1e-5-5ep
c91713cc1c0a0d316f467ae16010fdde6dc9f267
7.889005
0
0.63
false
false
false
false
1.641911
0.229187
22.918744
0.325934
6.537614
0.052115
5.21148
0.279362
3.914989
0.323521
1.106771
0.1688
7.64443
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-4
e6754bdc0fbeb7fc8d0df3c3677c0f70f9c4d3a8
5.555176
mit
0
0.63
true
false
false
false
1.707137
0.203434
20.343356
0.293555
3.114588
0.024169
2.416918
0.25755
1.006711
0.343427
1.861719
0.14129
4.587766
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-4-2ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-4-2ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-4-2ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-4-2ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-4-2ep
5a8ef93b6c4867dc95656b3c49f45fc26207444c
5.518302
0
0.63
false
false
false
false
1.711045
0.183075
18.307535
0.298396
3.199499
0.024924
2.492447
0.24245
0
0.35676
3.728385
0.148438
5.381944
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-4-3ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-4-3ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-4-3ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-4-3ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-4-3ep
7c5ba3070e1f395881226ab15cfad8af17522c52
5.465283
mit
0
0.63
true
false
false
false
1.581723
0.198962
19.896209
0.310988
4.420916
0.015106
1.510574
0.260906
1.454139
0.344948
0.885156
0.141622
4.624704
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-4-5ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-4-5ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-4-5ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-4-5ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-4-5ep
ed756a6854f9ccf5dfa119f55ce74a7c791b0868
6.087119
0
0.63
false
false
false
false
1.640625
0.18972
18.971994
0.293642
3.06933
0.018127
1.812689
0.269295
2.572707
0.387396
6.357813
0.133644
3.73818
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5
59b7674144f30acea5e4470e29cc4d59b48d5e8e
6.96911
mit
0
0.63
true
false
false
false
1.684916
0.206756
20.675585
0.320397
4.981848
0.037009
3.700906
0.269295
2.572707
0.348667
2.35
0.167803
7.533614
false
false
2024-12-28
2024-12-28
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep
f4688fa593ea40689cc53d0683a2bb03262b0cd9
7.692368
0
0.63
false
false
false
false
1.74107
0.220145
22.014477
0.32172
5.764802
0.040785
4.07855
0.277685
3.691275
0.336698
2.720573
0.170961
7.88453
false
false
2024-12-29
2024-12-29
1
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_3e-7-3ep_0alp_5lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_3e-7-3ep_0alp_5lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_3e-7-3ep_0alp_5lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_3e-7-3ep_0alp_5lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_3e-7-3ep_0alp_5lam
c40b7687fb6eb50a8e7dbc85ed5a37d9566cef55
7.460129
0
0.63
false
false
false
true
1.680269
0.241053
24.105263
0.316718
6.70248
0.034743
3.47432
0.270973
2.796421
0.330125
1.432292
0.15625
6.25
false
false
2025-01-06
2025-01-07
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-1ep_0alp_5lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-1ep_0alp_5lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-1ep_0alp_5lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-1ep_0alp_5lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-1ep_0alp_5lam
d581f58e13931b7b21b6237a6277fbe130bec706
7.891701
0
0.494
false
false
false
true
0.962901
0.236856
23.685599
0.326004
6.89211
0.045317
4.531722
0.276007
3.467562
0.335521
2.440104
0.156998
6.333112
false
false
2025-01-03
2025-01-03
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-1ep_0alp_5lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-2ep_0alp_5lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-2ep_0alp_5lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-2ep_0alp_5lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-2ep_0alp_5lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-2ep_0alp_5lam
cbb9bedaa4b57478682a4e96e2807fede7a85a39
7.618464
0
0.494
false
false
false
true
0.96645
0.22624
22.623971
0.326154
7.351407
0.034743
3.47432
0.279362
3.914989
0.340823
2.336198
0.154089
6.0099
false
false
2025-01-03
2025-01-03
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-2ep_0alp_5lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-3ep_0alp_5lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-3ep_0alp_5lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-3ep_0alp_5lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-3ep_0alp_5lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-6-3ep_0alp_5lam
02d1f2ab9340abc3592d8b8e5ed654b31d0c314c
7.868643
0
0.63
false
false
false
true
1.673511
0.250795
25.079456
0.319933
6.453776
0.040785
4.07855
0.276007
3.467562
0.335458
1.965625
0.155502
6.166888
false
false
2025-01-02
2025-01-03
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-1ep_0alp_5lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-1ep_0alp_5lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-1ep_0alp_5lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-1ep_0alp_5lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-1ep_0alp_5lam
c064c510a2b5723caaa43cdae0fda827f7d671d6
7.430132
0
0.494
false
false
false
true
0.94996
0.238979
23.897924
0.31816
6.676264
0.04003
4.003021
0.267617
2.348993
0.332792
1.432292
0.156001
6.222296
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-1ep_0alp_5lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-2ep_0alp_5lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-2ep_0alp_5lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-2ep_0alp_5lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-2ep_0alp_5lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-2ep_0alp_5lam
066449b81749965ba3713217c5f924a1f96d19c9
7.336427
0
0.494
false
false
false
true
0.953034
0.242302
24.230154
0.315408
6.449025
0.034743
3.47432
0.267617
2.348993
0.332792
1.432292
0.154754
6.083777
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-2ep_0alp_5lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-3ep_0alp_5lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-3ep_0alp_5lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-3ep_0alp_5lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-3ep_0alp_5lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPOP_5e-7-3ep_0alp_5lam
c89911f8d376b2504dc7ef337d366ff13fd6fbc1
7.64847
0
0.63
false
false
false
true
1.619905
0.249321
24.932069
0.318972
6.900839
0.043807
4.380665
0.265101
2.013423
0.334125
1.432292
0.156084
6.231531
false
false
2025-01-02
2025-01-02
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-1ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-1ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-1ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-1ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-1ep_0alp_0lam
f0ec8638c05222a9f5b66ba4a283dc6535866cef
7.714071
0
0.494
false
false
false
true
0.933588
0.254167
25.416672
0.316719
6.330229
0.040785
4.07855
0.271812
2.908277
0.328854
1.106771
0.157995
6.443927
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-1ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-2ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-2ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-2ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-2ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-2ep_0alp_0lam
e5f701dc4c9011ac425ac3d8f43f0a0c5fc07485
7.634084
0
0.494
false
false
false
true
0.982622
0.245074
24.507418
0.315953
6.636387
0.040785
4.07855
0.274329
3.243848
0.330188
1.106771
0.156084
6.231531
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-2ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-3ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-3ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-3ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-3ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-6-3ep_0alp_0lam
297d485cf519619f394f13f1cb4c18bb724fa587
7.77225
0
0.63
false
false
false
true
1.58521
0.25574
25.574032
0.314198
6.317309
0.04003
4.003021
0.274329
3.243848
0.331521
1.106771
0.157497
6.38852
false
false
2024-12-31
2025-01-02
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-2ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-2ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-2ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-2ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-2ep_0alp_0lam
338a87eaf84f8f843abfc765fa761f4aadfa650d
7.773024
0
0.494
false
false
false
true
0.95531
0.260536
26.053649
0.316697
6.435274
0.036254
3.625378
0.270134
2.684564
0.334125
1.432292
0.157663
6.406989
false
false
2025-01-06
2025-01-07
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-2ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-3ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-3ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-3ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-3ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_1e-7-3ep_0alp_0lam
dd17624113765a095ddb8908cb145f2b186e92d0
7.569906
0
0.63
false
false
false
true
1.730292
0.257814
25.781371
0.31732
6.385351
0.035498
3.549849
0.263423
1.789709
0.328792
1.432292
0.158328
6.480866
false
false
2025-01-05
2025-01-07
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-1ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-1ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-1ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-1ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-1ep_0alp_0lam
4f0af6566d027fd47391c5ae107634ac53b4f37a
7.393492
0
0.494
false
false
false
true
0.939187
0.233534
23.353369
0.319762
6.402745
0.03852
3.851964
0.275168
3.355705
0.327552
0.94401
0.158078
6.453162
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-1ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-2ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-2ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-2ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-2ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-2ep_0alp_0lam
7e9475946e5e36c20e5e16ddbcda8a1ed66d68dd
7.844418
0
0.494
false
false
false
true
0.953861
0.247197
24.719744
0.322627
6.901808
0.050604
5.060423
0.276007
3.467562
0.326219
0.94401
0.153757
5.972961
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-2ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-3ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-3ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-3ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-3ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_2e-6-3ep_0alp_0lam
c9dd97520558d2f9c363d8088ca940b2a22e78d5
7.696682
0
0.63
false
false
false
true
1.617976
0.247422
24.742239
0.322912
7.002679
0.041541
4.154079
0.272651
3.020134
0.32749
1.269531
0.153923
5.99143
false
false
2025-01-01
2025-01-02
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-1ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-1ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-1ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-1ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-1ep_0alp_0lam
105abeccbc545d8b3400a1f62c33196411580777
7.753029
0
0.494
false
false
false
true
0.950373
0.240278
24.027802
0.324537
6.620615
0.043051
4.305136
0.281879
4.250559
0.326219
0.94401
0.15733
6.37005
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-1ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-2ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-2ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-2ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-2ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-2ep_0alp_0lam
1aa0cee350e6029e0d1f1af2337625e241ffa794
7.552425
0
0.494
false
false
false
true
0.983856
0.236806
23.680612
0.322429
6.780448
0.046073
4.607251
0.274329
3.243848
0.33549
1.269531
0.151596
5.732861
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-2ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-6-3ep_0alp_0lam
b50ee3302d29ee2fe2a0eba12b14b425ddd8f8af
7.645766
0
0.63
false
false
false
true
1.599665
0.237181
23.718068
0.324771
7.007126
0.047583
4.758308
0.270134
2.684564
0.339427
1.595052
0.155003
6.11148
false
false
2025-01-01
2025-01-02
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-7-3ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-7-3ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-7-3ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-7-3ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_3e-7-3ep_0alp_0lam
5dada6736554e6198d817258e32db4ba67598c6a
7.58004
0
0.63
false
false
false
true
1.727522
0.24992
24.992021
0.31806
6.671899
0.041541
4.154079
0.265101
2.013423
0.328823
1.269531
0.157414
6.379285
false
false
2025-01-06
2025-01-07
2
Qwen/Qwen2.5-0.5B
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-1ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-1ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-1ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-1ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-1ep_0alp_0lam
b82b883ed0c5ad5cc1156f69343b0a73e48216f0
7.682978
0
0.494
false
false
false
true
0.924426
0.238105
23.81049
0.324218
6.428288
0.049849
4.984894
0.274329
3.243848
0.332823
1.269531
0.157247
6.360816
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-1ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2Model
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam
3506c4972cb7966146afcd51f7bfc85a6ad1af4a
7.668495
0
0.494
false
false
false
true
1.438821
0.242077
24.207658
0.32248
6.992687
0.04003
4.003021
0.280201
4.026846
0.340823
1.269531
0.149601
5.511229
false
false
2025-01-02
2025-01-02
0
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-2ep_0alp_0lam
JayHyeon_Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-3ep_0alp_0lam_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-3ep_0alp_0lam" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-3ep_0alp_0lam</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/JayHyeon__Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-3ep_0alp_0lam-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
JayHyeon/Qwen2.5-0.5B-SFT-2e-5-2ep-DPO_5e-6-3ep_0alp_0lam
6dbdfc7167e8066b907a67bf271a3ece465c9126
7.698047
0
0.63
false
false
false
true
1.568427
0.238055
23.805503
0.32652
7.221133
0.044562
4.456193
0.276007
3.467562
0.340792
1.698958
0.14985
5.538933
false
false
2025-01-01
2025-01-03
2
Qwen/Qwen2.5-0.5B