Schema (36 columns). Each record below lists its values in this order; fields that are null for a given model appear to be omitted from its record (e.g. some rows carry no Hub License or Submission Date line).

eval_name: string, length 12 to 111
Precision: string, 3 distinct values
Type: string, 7 distinct values
T: string, 7 distinct values
Weight type: string, 2 distinct values
Architecture: string, 64 distinct values
Model: string, length 355 to 689
fullname: string, length 4 to 102
Model sha: string, length 0 to 40
Average ⬆️: float64, 0.74 to 52.1
Hub License: string, 27 distinct values
Hub ❤️: int64, 0 to 6.09k
#Params (B): float64, -1 to 141
Available on the hub: bool
MoE: bool
Flagged: bool
Chat Template: bool
CO₂ cost (kg): float64, 0.04 to 187
IFEval Raw: float64, 0 to 0.9
IFEval: float64, 0 to 90
BBH Raw: float64, 0.22 to 0.83
BBH: float64, 0.25 to 76.7
MATH Lvl 5 Raw: float64, 0 to 0.71
MATH Lvl 5: float64, 0 to 71.5
GPQA Raw: float64, 0.21 to 0.47
GPQA: float64, 0 to 29.4
MUSR Raw: float64, 0.29 to 0.6
MUSR: float64, 0 to 38.7
MMLU-PRO Raw: float64, 0.1 to 0.73
MMLU-PRO: float64, 0 to 70
Merged: bool
Official Providers: bool
Upload To Hub Date: string, 525 distinct values
Submission Date: string, 263 distinct values
Generation: int64, 0 to 10
Base Model: string, length 4 to 102
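To work with this table programmatically, the sketch below loads it with the Hugging Face datasets library and pulls one record. This is a minimal sketch under one assumption: that this dump corresponds to the open-llm-leaderboard/contents dataset (the per-model details links below all point at open-llm-leaderboard/*-details repos); substitute your own repo id if your copy lives elsewhere.

    # Minimal sketch, assuming the table is the open-llm-leaderboard/contents
    # dataset on the Hugging Face Hub; adjust the repo id if it differs.
    from datasets import load_dataset

    ds = load_dataset("open-llm-leaderboard/contents", split="train")
    print(ds.column_names)  # the 36 columns listed above

    # Look up a single model by its repo name (the "fullname" column).
    rows = ds.filter(lambda r: r["fullname"] == "DavidAU/L3-Dark-Planet-8B")
    print(rows[0]["Average ⬆️"], rows[0]["#Params (B)"])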
DavidAU_Gemma-The-Writer-N-Restless-Quill-10B-Uncensored_float16
float16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
https://huggingface.co/DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-N-Restless-Quill-10B-Uncensored-details)
DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored
1138d6b3e3527b75e7331044b1f0589a90667e8d
31.679346
3
10.034
false
false
false
true
3.493397
0.707093
70.709274
0.592229
40.850091
0.229607
22.960725
0.341443
12.192394
0.416323
10.407031
0.396609
32.95656
false
false
2024-10-30
2025-01-11
1
DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored (Merge)
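The Average ⬆️ value is the arithmetic mean of the six normalized benchmark columns (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO), which can be checked directly against the record above. The Raw columns hold unnormalized accuracies; for instance, the GPQA value here is consistent with rescaling GPQA Raw against a 0.25 random-guess floor: (0.341443 - 0.25) / 0.75 * 100 ≈ 12.19.

    # Verify the Average ⬆️ field of the record above
    # (DavidAU/Gemma-The-Writer-N-Restless-Quill-10B-Uncensored).
    scores = [70.709274, 40.850091, 22.960725, 12.192394, 10.407031, 32.95656]
    print(sum(scores) / len(scores))  # 31.679346..., matching Average ⬆️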
DavidAU_L3-DARKEST-PLANET-16.5B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-DARKEST-PLANET-16.5B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-DARKEST-PLANET-16.5B-details)
DavidAU/L3-DARKEST-PLANET-16.5B
37545fbc229061956c1801968c33c5b187512c41
24.265056
5
16.537
false
false
false
true
4.225041
0.623062
62.306236
0.523044
31.776241
0.089879
8.987915
0.295302
6.040268
0.375365
7.253906
0.363032
29.225768
false
false
2024-10-11
2025-01-11
1
DavidAU/L3-DARKEST-PLANET-16.5B (Merge)
DavidAU_L3-Dark-Planet-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Dark-Planet-8B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Dark-Planet-8B-details)
DavidAU/L3-Dark-Planet-8B
462c9307ba4cfcb0c1edcceac5e06f4007bc803e
20.469184
6
8.03
false
false
false
false
1.878281
0.413411
41.341086
0.508408
29.789627
0.082326
8.232628
0.300336
6.711409
0.361594
6.332552
0.37367
30.407801
false
false
2024-09-05
2024-09-12
1
DavidAU/L3-Dark-Planet-8B (Merge)
DavidAU_L3-Jamet-12.2B-MK.V-Blackroot-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Jamet-12.2B-MK.V-Blackroot-Instruct-details)
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct
db4ae3d7b608fd0e7490d2fcfa0436e56e21af33
17.857043
0
12.174
false
false
false
false
1.437522
0.3962
39.619986
0.476572
25.869793
0.040785
4.07855
0.278523
3.803132
0.401969
8.31276
0.329122
25.458038
false
false
2024-08-23
2024-09-04
1
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct (Merge)
DavidAU_L3-Lumimaid-12.2B-v0.1-OAS-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Lumimaid-12.2B-v0.1-OAS-Instruct-details)
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct
65a9e957dc4211aa3d87fdf588767823af5cde3f
17.831556
1
12.174
false
false
false
false
2.849414
0.392403
39.240327
0.469302
24.504816
0.046073
4.607251
0.276846
3.579418
0.419427
11.261719
0.314162
23.795804
false
false
2024-08-24
2024-09-12
1
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct (Merge)
DavidAU_L3-SMB-Instruct-12.2B-F32_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-SMB-Instruct-12.2B-F32 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-SMB-Instruct-12.2B-F32-details)
DavidAU/L3-SMB-Instruct-12.2B-F32
ac5e205a41b17a7b05b1b62f352aacc7e65b2f13
18.901639
1
12.174
false
false
false
false
2.764794
0.430322
43.032155
0.478641
26.130957
0.046828
4.682779
0.281879
4.250559
0.408729
9.624479
0.3312
25.688904
false
false
2024-08-25
2024-09-12
1
DavidAU/L3-SMB-Instruct-12.2B-F32 (Merge)
DavidAU_L3-Stheno-Maid-Blackroot-Grand-HORROR-16B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-details)
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B
7b626e50b6c35fcb064b8b039fcf30eae01c3fae
17.197491
0
16.537
false
false
false
false
5.845597
0.343893
34.389309
0.473633
26.692021
0.021903
2.190332
0.270973
2.796421
0.403115
8.55599
0.357048
28.560875
false
false
2024-08-23
2024-09-04
1
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B (Merge)
DavidAU_L3-Stheno-v3.2-12.2B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Stheno-v3.2-12.2B-Instruct (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-v3.2-12.2B-Instruct-details)
DavidAU/L3-Stheno-v3.2-12.2B-Instruct
8271fc32a601a4fa5efbe58c41a0ef4181ad8836
18.73968
1
12.174
false
false
false
false
2.795399
0.402795
40.279459
0.484598
27.369623
0.050604
5.060423
0.275168
3.355705
0.41025
10.314583
0.334525
26.058289
false
false
2024-08-24
2024-09-12
1
DavidAU/L3-Stheno-v3.2-12.2B-Instruct (Merge)
DavidAU_L3.1-Dark-Planet-SpinFire-Uncensored-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3.1-Dark-Planet-SpinFire-Uncensored-8B-details)
DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B
9e4ae1310a0d2c82d50fe2aedc94ef084901ac48
24.710302
4
8.03
false
false
false
true
1.260564
0.70427
70.427023
0.526091
32.461783
0.0929
9.29003
0.279362
3.914989
0.354125
2.498958
0.367021
29.669031
false
false
2024-11-10
2025-01-11
1
DavidAU/L3.1-Dark-Planet-SpinFire-Uncensored-8B (Merge)
DavidAU_L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-uncensored-abliterated-13.7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
https://huggingface.co/DavidAU/L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-uncensored-abliterated-13.7B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-uncensored-abliterated-13.7B-details)
DavidAU/L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-uncensored-abliterated-13.7B
1ed5318f6bf5461efa5168289ab6786f4987ca96
19.615401
0
13.668
false
true
false
false
1.403573
0.334526
33.452573
0.442082
21.197829
0.260574
26.057402
0.313758
8.501119
0.374865
7.458073
0.289229
21.025414
false
false
2025-03-06
2025-03-10
1
DavidAU/L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-uncensored-abliterated-13.7B (Merge)
DavidAU_Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2MoeForCausalLM
https://huggingface.co/DavidAU/Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B-details)
DavidAU/Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B
412099bb6f570707a3a6ee311bfeb93a204c1b7b
5.120377
0
4.089
false
true
false
false
1.126877
0.178329
17.832906
0.303261
3.023581
0.024924
2.492447
0.259228
1.230425
0.371458
4.565625
0.114195
1.577275
false
false
2025-03-06
2025-03-10
1
DavidAU/Qwen2.5-MOE-2X1.5B-DeepSeek-Uncensored-Censored-4B (Merge)
DavidAU_Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2MoeForCausalLM
https://huggingface.co/DavidAU/Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B-details)
DavidAU/Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B
8a908eadce5fc13ddedab2c854433245de430e41
13.021728
0
19.022
false
true
false
false
2.259668
0.283518
28.351773
0.359227
10.870199
0.241692
24.169184
0.265101
2.013423
0.384698
5.653906
0.163647
7.071882
false
false
2025-03-06
2025-03-10
1
DavidAU/Qwen2.5-MOE-2X7B-DeepSeek-Abliterated-Censored-19B (Merge)
DavidAU_Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2MoeForCausalLM
https://huggingface.co/DavidAU/Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32-details)
DavidAU/Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32
7a447e7af3ce9ea515f25c04ab7f942fe637b521
6.418593
0
8.714
false
true
false
false
2.252699
0.210678
21.067767
0.328618
6.218965
0.066465
6.646526
0.247483
0
0.340448
3.222656
0.112201
1.355644
false
false
2025-03-05
2025-03-10
1
DavidAU/Qwen2.5-MOE-6x1.5B-DeepSeek-Reasoning-e32 (Merge)
Davidsv_SUONG-1_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/Davidsv/SUONG-1 (details: https://huggingface.co/datasets/open-llm-leaderboard/Davidsv__SUONG-1-details)
Davidsv/SUONG-1
5bab856eaa8836d4f37d736926bdd18b97ac3241
5.322342
1
2.879
false
false
false
false
0.22068
0.249721
24.972074
0.281713
1.827242
0
0
0.244128
0
0.35775
4.185417
0.108544
0.94932
false
false
2025-02-14
2025-02-14
1
Davidsv/SUONG-1 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter0 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter0-details)
DavieLion/Llama-3.2-1B-SPIN-iter0
bc1a37920fb5e3cb64a71a4deda649f33fecb95d
3.623817
llama3.2
0
1.236
true
false
false
false
0.36965
0.150677
15.067687
0.293008
2.100828
0
0
0.253356
0.447427
0.356542
2.734375
0.112533
1.392583
false
false
2024-12-27
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter0 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter0 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter0-details)
DavieLion/Llama-3.2-1B-SPIN-iter0
2c95189201f94c64fcf4c9a7edc4777741f18999
3.985688
llama3.2
0
1.236
true
false
false
false
0.708953
0.154923
15.492338
0.293726
2.330669
0.006042
0.60423
0.25755
1.006711
0.356479
3.059896
0.112783
1.420287
false
false
2024-12-27
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter0 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter1 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter1-details)
DavieLion/Llama-3.2-1B-SPIN-iter1
8c632ae68bd385af2e2270933326edbcd0044e8c
3.751975
llama3.2
0
1.236
true
false
false
false
0.72172
0.157546
15.754642
0.294025
2.433772
0.002266
0.226586
0.250839
0.111857
0.364604
2.675521
0.111785
1.309471
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter1 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter2 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter2-details)
DavieLion/Llama-3.2-1B-SPIN-iter2
36c9b3fd7196c6bac0fbe8f1e9c4f4fb3bcc993a
3.658146
llama3.2
0
1.236
true
false
false
false
0.707076
0.137613
13.761265
0.298034
3.157343
0.005287
0.528701
0.254195
0.559284
0.355302
2.51276
0.112866
1.429521
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter2 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter3 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter3-details)
DavieLion/Llama-3.2-1B-SPIN-iter3
108557f0db9b6f7c35ba8b0d094ebd81be6fe9fd
3.593141
llama3.2
0
1.236
true
false
false
false
1.093083
0.133591
13.359109
0.297523
3.139502
0.006798
0.679758
0.253356
0.447427
0.349969
2.51276
0.112783
1.420287
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter3 (Merge)
DavieLion_Llama-3.2-1B-SPIN-iter3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavieLion/Llama-3.2-1B-SPIN-iter3 (details: https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Llama-3.2-1B-SPIN-iter3-details)
DavieLion/Llama-3.2-1B-SPIN-iter3
ae511fd6bae53efd2656dd3cc6fc87d0fc56356c
3.61369
llama3.2
0
1.236
true
false
false
false
0.36323
0.132392
13.239205
0.297224
3.028514
0
0
0.264262
1.901566
0.352667
2.083333
0.112866
1.429521
false
false
2024-12-29
2024-12-29
1
DavieLion/Llama-3.2-1B-SPIN-iter3 (Merge)
DavieLion_Lllma-3.2-1B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavieLion/Lllma-3.2-1B (details: https://huggingface.co/datasets/open-llm-leaderboard/DavieLion__Lllma-3.2-1B-details)
DavieLion/Lllma-3.2-1B
5e0d3bc7ca705a41f897a870efd4ff6ce455e20c
3.932332
llama3.2
0
1.236
true
false
false
false
0.73375
0.160144
16.014397
0.296469
2.438123
0.006798
0.679758
0.244128
0
0.357813
3.059896
0.112616
1.401817
false
false
2024-12-27
2024-12-27
0
DavieLion/Lllma-3.2-1B
DebateLabKIT_Llama-3.1-Argunaut-1-8B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT (details: https://huggingface.co/datasets/open-llm-leaderboard/DebateLabKIT__Llama-3.1-Argunaut-1-8B-SFT-details)
DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT
e9d7396bc0fa3d1ff4c1f4b1a0d81a1d1a7e977c
24.113556
llama3.1
6
8.03
true
false
false
true
1.433957
0.551921
55.192112
0.482383
27.187827
0.145015
14.501511
0.283557
4.474273
0.450302
15.854427
0.347241
27.471188
false
false
2024-12-31
2025-01-02
1
DebateLabKIT/Llama-3.1-Argunaut-1-8B-SFT (Merge)
Deci_DeciLM-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
DeciLMForCausalLM
https://huggingface.co/Deci/DeciLM-7B (details: https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-details)
Deci/DeciLM-7B
c3c9f4226801dc0433f32aebffe0aac68ee2f051
15.023478
apache-2.0
226
7.044
true
false
false
false
1.284275
0.281295
28.129474
0.442286
21.25273
0.028701
2.870091
0.295302
6.040268
0.435854
13.048438
0.269199
18.799867
false
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B
Deci_DeciLM-7B-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
DeciLMForCausalLM
https://huggingface.co/Deci/DeciLM-7B-instruct (details: https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-instruct-details)
Deci/DeciLM-7B-instruct
4adc7aa9efe61b47b0a98b2cc94527d9c45c3b4f
17.470092
apache-2.0
96
7.044
true
false
false
true
1.277299
0.488024
48.8024
0.458975
23.887149
0.030211
3.021148
0.28943
5.257271
0.388417
5.985417
0.260805
17.867169
false
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B-instruct
DeepAutoAI_Explore_Llama-3.1-8B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.1-8B-Inst (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.1-8B-Inst-details)
DeepAutoAI/Explore_Llama-3.1-8B-Inst
9752180fafd8f584625eb649c0cba36b91bdc3ce
28.926701
apache-2.0
0
8.03
true
false
false
true
2.639465
0.779483
77.948288
0.511742
30.393263
0.200906
20.090634
0.283557
4.474273
0.390958
9.636458
0.379156
31.017287
false
false
2024-09-21
2024-10-09
1
DeepAutoAI/Explore_Llama-3.1-8B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst-details)
DeepAutoAI/Explore_Llama-3.2-1B-Inst
9fd790df246b8979c02173f7698819a7805fb04e
13.897377
apache-2.0
0
1.236
true
false
false
true
1.325412
0.564886
56.488561
0.350481
8.292274
0.074773
7.477341
0.255872
0.782998
0.318344
1.359635
0.180851
8.983452
false
false
2024-10-07
2024-10-09
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v0-details)
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
9509dee6b01fff1a11dc26cf58d7eecbe3d9d9c4
13.359085
1
1.236
false
false
false
true
0.934379
0.559715
55.971489
0.336509
7.042772
0.059668
5.966767
0.263423
1.789709
0.310313
0.455729
0.180352
8.928044
false
false
2024-10-08
2024-10-08
0
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v1-details)
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1
3f8b0fb6dcc1e9725ba52dd086241d5d9e413100
10.921434
apache-2.0
0
1.236
true
false
false
true
0.939932
0.499889
49.988918
0.314148
4.25778
0.030967
3.096677
0.244966
0
0.378094
5.195052
0.126912
2.990174
false
false
2024-10-08
2024-10-08
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v1.1-details)
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1
158b977bca89e073871e2313740a7c75eb1291af
14.311829
apache-2.0
0
1.236
true
false
false
true
1.360524
0.584419
58.441934
0.351266
8.818154
0.071752
7.175227
0.262584
1.677852
0.311708
0.663542
0.181848
9.094267
false
false
2024-10-09
2024-10-17
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 (Merge)
DeepAutoAI_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
https://huggingface.co/DeepAutoAI/causal_gpt2 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__causal_gpt2-details)
DeepAutoAI/causal_gpt2
995f029f6645dde1ef830406001754b904c49775
6.032059
1
0.124
false
false
false
false
0.25173
0.181277
18.127679
0.302571
2.633344
0.005287
0.528701
0.260067
1.342282
0.426958
12.103125
0.113115
1.457225
false
false
2024-10-17
2024-10-17
0
DeepAutoAI/causal_gpt2
DeepAutoAI_d2nwg_Llama-3.1-8B-Instruct-v0.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_Llama-3.1-8B-Instruct-v0.0-details)
DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0
8bad8800d04a06f3f906728ee223cab2f50453a0
29.338965
0
8.03
false
false
false
true
1.712356
0.789275
78.927468
0.508041
30.510076
0.180514
18.05136
0.291946
5.592841
0.413469
10.983594
0.387716
31.968454
false
false
2024-09-10
2024-09-10
0
DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0
DeepAutoAI_d2nwg_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2-details)
DeepAutoAI/d2nwg_causal_gpt2
eab065cba5a7a9b08f8b264d61d504c4ecbb611b
6.305441
0
0.124
false
false
false
false
0.259815
0.191618
19.161824
0.30269
2.850574
0.004532
0.453172
0.25755
1.006711
0.429719
12.68151
0.11511
1.678856
false
false
2024-10-18
2024-10-18
0
DeepAutoAI/d2nwg_causal_gpt2
DeepAutoAI_d2nwg_causal_gpt2_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2_v1 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2_v1-details)
DeepAutoAI/d2nwg_causal_gpt2_v1
3f40c3dcb3eb591dec80ff03573eec7928a7feaa
6.419566
0
0.124
false
false
false
false
0.343007
0.198862
19.886235
0.29919
2.387278
0.003776
0.377644
0.258389
1.118568
0.433688
13.244271
0.113531
1.503398
false
false
2024-10-18
2024-10-19
0
DeepAutoAI/d2nwg_causal_gpt2_v1
DeepAutoAI_ldm_soup_Llama-3.1-8B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Inst-details)
DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst
0f04c5ad830f8ae0828191a4670fd4ba361b63d2
29.859058
apache-2.0
3
8.03
true
false
false
true
2.570061
0.803263
80.326312
0.512117
31.101628
0.188822
18.882175
0.28943
5.257271
0.416135
11.516927
0.38863
32.070035
false
false
2024-09-16
2024-10-09
1
DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst (Merge)
DeepAutoAI_ldm_soup_Llama-3.1-8B-Instruct-v0.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Instruct-v0.0-details)
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0
210a97b4dadbda63cc9fe459e8415d4cd3bbaf99
29.735244
0
8.03
false
false
false
true
1.720909
0.78895
78.894999
0.512518
31.162649
0.191843
19.18429
0.291107
5.480984
0.412135
11.516927
0.389545
32.171616
false
false
2024-09-14
2024-09-15
0
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0
DeepAutoAI_ldm_soup_Llama-3.1-8B-Instruct-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Instruct-v0.1-details)
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1
ecd140c95985b4292c896e25a94a7629d2924ad1
29.735244
0
8.03
false
false
false
true
1.656892
0.78895
78.894999
0.512518
31.162649
0.191843
19.18429
0.291107
5.480984
0.412135
11.516927
0.389545
32.171616
false
false
2024-09-15
2024-09-16
0
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1
DeepMount00_Lexora-Lite-3B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Lexora-Lite-3B (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Lexora-Lite-3B-details)
DeepMount00/Lexora-Lite-3B
2cf39db7ecac17edca0bf4e0973b7fb58c40c22c
24.888387
1
3.086
false
false
false
true
3.161923
0.5776
57.759966
0.487339
28.436279
0.230363
23.036254
0.274329
3.243848
0.396604
7.942188
0.360206
28.911791
false
false
2024-09-19
2024-10-20
0
DeepMount00/Lexora-Lite-3B
DeepMount00_Lexora-Lite-3B_v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Lexora-Lite-3B_v2 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Lexora-Lite-3B_v2-details)
DeepMount00/Lexora-Lite-3B_v2
0562af3800440fe9839bd6e885d9e0062ab70ead
22.690213
1
3.086
false
false
false
true
0.774935
0.494318
49.431841
0.481177
27.168452
0.228097
22.809668
0.270973
2.796421
0.382156
5.669531
0.354388
28.265366
false
false
2024-09-19
2025-02-25
0
DeepMount00/Lexora-Lite-3B_v2
DeepMount00_Lexora-Medium-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Lexora-Medium-7B (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Lexora-Medium-7B-details)
DeepMount00/Lexora-Medium-7B
c53d166f4f2996a5b7f161529f1ea6548b54a2b2
25.837198
apache-2.0
5
7.616
true
false
false
true
3.469822
0.410338
41.03379
0.514484
32.695331
0.222054
22.205438
0.305369
7.38255
0.443948
14.760156
0.432513
36.945922
false
false
2024-09-24
2024-09-24
0
DeepMount00/Lexora-Medium-7B
DeepMount00_Llama-3-8b-Ita_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DeepMount00/Llama-3-8b-Ita (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3-8b-Ita-details)
DeepMount00/Llama-3-8b-Ita
d40847d2981b588690c1dc21d5157d3f4afb2978
26.796816
llama3
24
8.03
true
false
false
true
1.556517
0.75303
75.302974
0.493577
28.077746
0.066465
6.646526
0.305369
7.38255
0.426771
11.679688
0.385223
31.691415
false
false
2024-05-01
2024-06-27
1
meta-llama/Meta-Llama-3-8B
DeepMount00_Llama-3.1-8b-ITA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DeepMount00/Llama-3.1-8b-ITA (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-8b-ITA-details)
DeepMount00/Llama-3.1-8b-ITA
5ede1e388b6b15bc06acd364a8f805fe9ed16db9
28.228098
6
8.03
false
false
false
true
2.507574
0.791673
79.167276
0.510936
30.933181
0.108761
10.876133
0.287752
5.033557
0.413594
11.399219
0.387633
31.95922
false
false
2024-08-13
2024-10-28
2
meta-llama/Meta-Llama-3.1-8B
DeepMount00_Llama-3.1-8b-Ita_bfloat16
bfloat16
❓ other
Original
Unknown
https://huggingface.co/DeepMount00/Llama-3.1-8b-Ita (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-8b-Ita-details)
DeepMount00/Llama-3.1-8b-Ita
5ede1e388b6b15bc06acd364a8f805fe9ed16db9
26.265732
6
0
false
false
false
false
0.906247
0.536484
53.648431
0.517
31.333639
0.170695
17.069486
0.306208
7.494407
0.448719
15.15651
0.396027
32.891918
false
false
2024-08-13
2
meta-llama/Meta-Llama-3.1-8B
DeepMount00_Llama-3.1-Distilled_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DeepMount00/Llama-3.1-Distilled (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-Distilled-details)
DeepMount00/Llama-3.1-Distilled
0a94c7ddb196107e8bf1b02e31488ff8c17b9eb3
29.631398
llama3
1
8.03
true
false
false
true
1.678
0.784379
78.437878
0.510088
30.841421
0.203172
20.317221
0.303691
7.158837
0.405812
10.126562
0.378158
30.906472
false
false
2024-10-25
2024-10-25
1
meta-llama/Meta-Llama-3-8B
DeepMount00_Qwen2-1.5B-Ita_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Qwen2-1.5B-Ita (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2-1.5B-Ita-details)
DeepMount00/Qwen2-1.5B-Ita
26a6671a48c0023293c447932798a3ec72b55a29
16.831761
apache-2.0
21
1.544
true
false
false
true
0.512551
0.51735
51.734952
0.398058
15.422996
0.114048
11.404834
0.262584
1.677852
0.350375
1.063542
0.277178
19.686392
false
false
2024-06-13
2025-02-28
0
DeepMount00/Qwen2-1.5B-Ita
DeepMount00_Qwen2-1.5B-Ita_v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Qwen2-1.5B-Ita_v2 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2-1.5B-Ita_v2-details)
DeepMount00/Qwen2-1.5B-Ita_v2
e9c2a4197001bf188e4bc7d49873ea84f01e27c6
17.070009
apache-2.0
21
1.544
true
false
false
true
0.554679
0.499989
49.998892
0.395383
15.106125
0.096677
9.667674
0.259228
1.230425
0.370187
3.840104
0.303191
22.576832
false
false
2024-06-13
2025-03-06
0
DeepMount00/Qwen2-1.5B-Ita_v2
DeepMount00_Qwen2-1.5B-Ita_v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Qwen2-1.5B-Ita_v3 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2-1.5B-Ita_v3-details)
DeepMount00/Qwen2-1.5B-Ita_v3
4faa0ebc54beab39e1f044af1fee3ce44d9b8755
16.948513
apache-2.0
21
1.544
true
false
false
true
0.584626
0.489048
48.904795
0.394848
15.226522
0.10423
10.422961
0.253356
0.447427
0.374156
4.269531
0.301779
22.419843
false
false
2024-06-13
2025-03-06
0
DeepMount00/Qwen2-1.5B-Ita_v3
DeepMount00_Qwen2-1.5B-Ita_v5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Qwen2-1.5B-Ita_v5 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2-1.5B-Ita_v5-details)
DeepMount00/Qwen2-1.5B-Ita_v5
681e6db531df0cc3d7806251659b973ed4ff8c8f
17.023241
apache-2.0
21
1.544
true
false
false
true
0.532105
0.49874
49.874001
0.403204
16.487038
0.117825
11.782477
0.254195
0.559284
0.34225
1.847917
0.294299
21.588726
false
false
2024-06-13
2025-03-10
0
DeepMount00/Qwen2-1.5B-Ita_v5
DeepMount00_Qwen2-1.5B-Ita_v6_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Qwen2-1.5B-Ita_v6 (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2-1.5B-Ita_v6-details)
DeepMount00/Qwen2-1.5B-Ita_v6
b3360bd6093edb8a98696443405f94ce37a40bd2
14.577672
0
1.497
false
false
false
true
0.607476
0.299904
29.990425
0.424861
19.093804
0.084592
8.459215
0.282718
4.362416
0.375458
4.765625
0.287151
20.794548
false
false
2025-03-10
0
Removed
DeepMount00_Qwen2.5-7B-Instruct-MathCoder_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Qwen2.5-7B-Instruct-MathCoder (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2.5-7B-Instruct-MathCoder-details)
DeepMount00/Qwen2.5-7B-Instruct-MathCoder
90df996cdb1f3d5f051513c50df4cdfda858b5f2
4.39691
0
7.616
false
false
false
true
2.585359
0.153025
15.302508
0.299844
2.636671
0.000755
0.075529
0.262584
1.677852
0.380635
5.379427
0.111785
1.309471
false
false
2024-10-24
0
Removed
DeepMount00_mergekit-ties-okvgjfz_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/mergekit-ties-okvgjfz (details: https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__mergekit-ties-okvgjfz-details)
DeepMount00/mergekit-ties-okvgjfz
90df996cdb1f3d5f051513c50df4cdfda858b5f2
4.39691
0
7.616
false
false
false
true
2.577643
0.153025
15.302508
0.299844
2.636671
0.000755
0.075529
0.262584
1.677852
0.380635
5.379427
0.111785
1.309471
false
false
2024-10-24
0
Removed
Delta-Vector_Baldur-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Delta-Vector/Baldur-8B (details: https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Baldur-8B-details)
Delta-Vector/Baldur-8B
97f5d321a8346551a5ed704997dd1e93c59883f3
24.191736
5
8
false
false
false
false
3.060931
0.478182
47.818233
0.530584
32.541834
0.143505
14.350453
0.302013
6.935123
0.437156
14.011198
0.365442
29.493573
false
false
2024-09-23
2024-10-06
1
Delta-Vector/Baldur-8B (Merge)
Delta-Vector_Control-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Delta-Vector/Control-8B (details: https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Control-8B-details)
Delta-Vector/Control-8B
c8743ee5ca0efd31aa9dd1bd14c770430c85a6c1
25.058026
2
8.03
false
false
false
true
1.354972
0.548973
54.897339
0.504146
29.155078
0.138973
13.897281
0.316275
8.836689
0.435542
13.209375
0.373172
30.352394
false
false
2024-10-23
2024-11-25
0
Delta-Vector/Control-8B
Delta-Vector_Control-8B-V1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Delta-Vector/Control-8B-V1.1 (details: https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Control-8B-V1.1-details)
Delta-Vector/Control-8B-V1.1
6d4593645d1c4dc61d1c223922f635d79283d22b
24.632509
0
8.03
false
false
false
true
1.281195
0.569656
56.965629
0.499284
28.72585
0.127644
12.76435
0.307047
7.606264
0.423729
11.232813
0.374501
30.500148
false
false
2024-10-30
2024-11-25
0
Delta-Vector/Control-8B-V1.1
Delta-Vector_Darkens-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/Delta-Vector/Darkens-8B (details: https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Darkens-8B-details)
Delta-Vector/Darkens-8B
e82be0389bfcecd1998dba1c3bb35b8d95d01bf2
18.937415
4
8.414
false
false
false
false
2.399486
0.254766
25.476624
0.525059
32.883795
0.058912
5.891239
0.324664
9.955257
0.410552
9.01901
0.373587
30.398567
false
false
2024-09-22
2024-10-06
1
Delta-Vector/Darkens-8B (Merge)
Delta-Vector_Henbane-7b-attempt2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/Delta-Vector/Henbane-7b-attempt2 (details: https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Henbane-7b-attempt2-details)
Delta-Vector/Henbane-7b-attempt2
448ef54e5af03e13f16f3db8ad8d1481479ac12e
23.81395
apache-2.0
1
7
true
false
false
true
2.267676
0.415734
41.573359
0.506118
30.865849
0.227341
22.734139
0.290268
5.369128
0.397344
8.701302
0.402759
33.639923
false
false
2024-09-13
2024-10-11
1
Qwen/Qwen2-7B
Delta-Vector_Odin-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
https://huggingface.co/Delta-Vector/Odin-9B (details: https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Odin-9B-details)
Delta-Vector/Odin-9B
9ff20f5dd427e751ada834319bfdd9ea60b5e89c
24.977113
4
9.242
false
false
false
false
5.416323
0.369197
36.919706
0.544025
34.832423
0.145015
14.501511
0.341443
12.192394
0.464781
17.564323
0.404671
33.85232
false
false
2024-09-27
2024-10-06
0
Delta-Vector/Odin-9B
Delta-Vector_Tor-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/Delta-Vector/Tor-8B (details: https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Tor-8B-details)
Delta-Vector/Tor-8B
d30a7a121c2ef5dc14004cfdf3fd13208dfbdb4f
18.406879
2
8.414
false
false
false
false
2.504107
0.238155
23.815476
0.520911
31.738224
0.058912
5.891239
0.323826
9.8434
0.409219
8.81901
0.373005
30.333924
false
false
2024-09-21
2024-10-06
1
Delta-Vector/Tor-8B (Merge)
DevQuasar_DevQuasar-R1-Uncensored-Llama-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DevQuasar/DevQuasar-R1-Uncensored-Llama-8B (details: https://huggingface.co/datasets/open-llm-leaderboard/DevQuasar__DevQuasar-R1-Uncensored-Llama-8B-details)
DevQuasar/DevQuasar-R1-Uncensored-Llama-8B
97a87606addb28c1d76d27cca5e5485c1dbff4e3
26.432649
mit
1
8.03
true
false
false
false
0.719522
0.384884
38.488433
0.511794
30.220238
0.330816
33.081571
0.347315
12.975391
0.443573
14.779948
0.361453
29.05031
true
false
2025-01-28
2025-02-09
1
DevQuasar/DevQuasar-R1-Uncensored-Llama-8B (Merge)
Dongwei_DeepSeek-R1-Distill-Qwen-7B-GRPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
https://huggingface.co/Dongwei/DeepSeek-R1-Distill-Qwen-7B-GRPO (details: https://huggingface.co/datasets/open-llm-leaderboard/Dongwei__DeepSeek-R1-Distill-Qwen-7B-GRPO-details)
Dongwei/DeepSeek-R1-Distill-Qwen-7B-GRPO
177ffda54582d6e8f3830722d91a3b5c99a38a1d
14.996462
1
7.616
false
false
false
true
1.344661
0.403769
40.376867
0.344257
7.882703
0.195619
19.561934
0.279362
3.914989
0.366281
3.551823
0.232214
14.690455
false
false
2025-02-01
2025-02-05
1
deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
DoppelReflEx_L3-8B-R1-WolfCore_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DoppelReflEx/L3-8B-R1-WolfCore (details: https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__L3-8B-R1-WolfCore-details)
DoppelReflEx/L3-8B-R1-WolfCore
b457a83cb3e4468315ccd5a768fd5302d2b9926d
23.481233
cc-by-nc-4.0
1
8.03
true
false
false
false
0.660439
0.37754
37.754048
0.531795
33.760105
0.163142
16.314199
0.328859
10.514541
0.427667
12.358333
0.371676
30.18617
true
false
2025-02-28
2025-02-28
1
DoppelReflEx/L3-8B-R1-WolfCore (Merge)
DoppelReflEx_L3-8B-R1-WolfCore-V1.5-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DoppelReflEx/L3-8B-R1-WolfCore-V1.5-test (details: https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__L3-8B-R1-WolfCore-V1.5-test-details)
DoppelReflEx/L3-8B-R1-WolfCore-V1.5-test
044841358609fdc68053b4c6c0a1c41db7e8d327
22.363618
0
8.03
false
false
false
false
0.654733
0.395501
39.550061
0.531495
33.459498
0.123112
12.311178
0.326342
10.178971
0.384073
8.375781
0.372756
30.30622
false
false
2025-03-01
2025-03-01
1
DoppelReflEx/L3-8B-R1-WolfCore-V1.5-test (Merge)
DoppelReflEx_L3-8B-WolfCore_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DoppelReflEx/L3-8B-WolfCore (details: https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__L3-8B-WolfCore-details)
DoppelReflEx/L3-8B-WolfCore
e83eab6e8f04065c770bced65bde494599c54cee
21.170087
1
8.03
false
false
false
false
0.653107
0.402195
40.219506
0.518198
31.290072
0.098187
9.818731
0.309564
7.941834
0.397281
7.69349
0.370512
30.056885
false
false
2025-02-28
2025-02-28
1
DoppelReflEx/L3-8B-WolfCore (Merge)
DoppelReflEx_MN-12B-FoxFrame-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/DoppelReflEx/MN-12B-FoxFrame-test (details: https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-FoxFrame-test-details)
DoppelReflEx/MN-12B-FoxFrame-test
b95a2da79360a9da785112ead60214f7b7605e25
23.221062
0
12.248
false
false
false
false
1.552566
0.422203
42.220309
0.545638
34.559814
0.139728
13.97281
0.307886
7.718121
0.425406
13.042448
0.350316
27.812869
false
false
2025-02-06
0
Removed
DoppelReflEx_MN-12B-FoxFrame2-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/DoppelReflEx/MN-12B-FoxFrame2-test (details: https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-FoxFrame2-test-details)
DoppelReflEx/MN-12B-FoxFrame2-test
322627ea048553a7c30c7351dfe4bff000d979eb
23.639729
cc-by-nc-4.0
2
12.248
true
false
false
false
0.750519
0.431895
43.189515
0.54848
34.9967
0.140483
14.048338
0.314597
8.612975
0.425188
12.448437
0.356882
28.542405
true
false
2025-02-08
2025-02-08
1
DoppelReflEx/MN-12B-FoxFrame2-test (Merge)
DoppelReflEx_MN-12B-FoxFrame3-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-FoxFrame3-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-FoxFrame3-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-FoxFrame3-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-FoxFrame3-test
a300b18573c9bcb4702d84e686fa826e7b695686
23.947188
cc-by-nc-4.0
1
12.248
true
false
false
false
0.697196
0.43232
43.231958
0.539476
34.041186
0.132175
13.217523
0.301174
6.823266
0.45976
18.270052
0.352892
28.099143
true
false
2025-02-08
2025-02-08
1
DoppelReflEx/MN-12B-FoxFrame3-test (Merge)
DoppelReflEx_MN-12B-Kakigori_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Kakigori" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Kakigori</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Kakigori-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Kakigori
43cdb3d3df47f5d4ed8386f411859b9d72ea9017
21.697733
cc-by-nc-4.0
2
12.248
true
false
false
false
1.59566
0.35933
35.932991
0.541553
34.331347
0.119335
11.933535
0.324664
9.955257
0.405219
9.352344
0.358128
28.680925
true
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-Kakigori (Merge)
DoppelReflEx_MN-12B-LilithFrame_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame
e3e8cce8267613d5c2ff68884aaeac8ab9b39e93
21.32106
0
12.248
false
false
false
false
1.856507
0.450955
45.095458
0.494426
27.492064
0.115559
11.555891
0.319631
9.284116
0.389563
9.428646
0.325632
25.070183
false
false
2025-01-29
0
Removed
DoppelReflEx_MN-12B-LilithFrame_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame
e3e8cce8267613d5c2ff68884aaeac8ab9b39e93
20.02254
0
12.248
false
false
false
false
0.928005
0.436042
43.604192
0.495613
27.653498
0.058912
5.891239
0.32047
9.395973
0.38426
8.732552
0.32372
24.857787
false
false
2025-01-29
0
Removed
DoppelReflEx_MN-12B-LilithFrame-Experiment-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame-Experiment-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame-Experiment-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-Experiment-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame-Experiment-2
75316e8ed913cf62482f36713a007d471813bb0e
21.00209
0
12.248
false
false
false
false
1.839135
0.429947
42.994699
0.498267
28.111183
0.107251
10.725076
0.325503
10.067114
0.380448
8.822656
0.327626
25.291814
false
false
2025-01-29
0
Removed
DoppelReflEx_MN-12B-LilithFrame-Experiment-3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame-Experiment-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame-Experiment-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-Experiment-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame-Experiment-3
e33ca2d80584a934a6c2ed1a9ba788b8998d0d15
23.139283
0
12.248
false
false
false
false
2.362723
0.412786
41.278585
0.546808
34.998286
0.134441
13.444109
0.32802
10.402685
0.403854
9.781771
0.360372
28.93026
false
false
2025-01-29
0
Removed
DoppelReflEx_MN-12B-LilithFrame-Experiment-4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-LilithFrame-Experiment-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-LilithFrame-Experiment-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-LilithFrame-Experiment-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-LilithFrame-Experiment-4
242fffeb766e1de3e7040cb7a981fc9fb37ada3c
23.528621
cc-by-nc-4.0
1
12.248
true
false
false
false
1.643713
0.398148
39.814803
0.553437
35.77765
0.122356
12.23565
0.317114
8.948546
0.437062
14.966146
0.36486
29.42893
true
false
2025-01-30
2025-01-30
1
DoppelReflEx/MN-12B-LilithFrame-Experiment-4 (Merge)
DoppelReflEx_MN-12B-Mimicore-GreenSnake_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-GreenSnake" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-GreenSnake</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-GreenSnake-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-GreenSnake
c1aee5ad2926129a5299e264a33c3890eb83cb8f
25.015013
cc-by-nc-4.0
2
12.248
true
false
false
false
1.688602
0.478007
47.800724
0.548051
35.390601
0.138973
13.897281
0.324664
9.955257
0.430583
13.589583
0.36511
29.456634
true
false
2025-01-27
2025-01-27
1
DoppelReflEx/MN-12B-Mimicore-GreenSnake (Merge)
DoppelReflEx_MN-12B-Mimicore-Nocturne_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Nocturne" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Nocturne</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Nocturne-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Nocturne
5cea74ebd1b0a4b3043e2789e21aa68706a9d817
24.066302
cc-by-nc-4.0
2
12.248
true
false
false
false
0.87958
0.39565
39.565021
0.570333
38.398668
0.10574
10.574018
0.319631
9.284116
0.456906
17.313281
0.363364
29.262707
true
false
2025-03-08
2025-03-09
1
DoppelReflEx/MN-12B-Mimicore-Nocturne (Merge)
DoppelReflEx_MN-12B-Mimicore-Orochi_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi
59515c9a5224bb45a1d2a7ea141e37a5ab9a9021
24.652223
cc-by-nc-4.0
2
12.248
true
false
false
false
1.520963
0.462045
46.204515
0.549774
35.28323
0.135952
13.595166
0.312919
8.389262
0.454583
17.25625
0.344664
27.184914
true
false
2025-01-28
2025-01-28
1
DoppelReflEx/MN-12B-Mimicore-Orochi (Merge)
DoppelReflEx_MN-12B-Mimicore-Orochi-v2-Experiment_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi-v2-Experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi-v2-Experiment</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-v2-Experiment-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi-v2-Experiment
b0140973cf249ecb2ba399f1174f8229c91dc363
19.804011
0
12.248
false
false
false
false
1.102281
0.284241
28.424137
0.532253
32.774711
0.061178
6.117825
0.297819
6.375839
0.457375
18.205208
0.342337
26.926345
false
false
2025-01-28
0
Removed
DoppelReflEx_MN-12B-Mimicore-Orochi-v3-Experiment_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi-v3-Experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi-v3-Experiment</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-v3-Experiment-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi-v3-Experiment
d1f9bd2cd64564217f59802648a941a57b2b9733
22.641023
0
12.248
false
false
false
false
1.334929
0.410163
41.016281
0.543782
34.56948
0.121601
12.160121
0.292785
5.704698
0.443792
15.773958
0.339594
26.621602
false
false
2025-01-28
0
Removed
DoppelReflEx_MN-12B-Mimicore-Orochi-v4-Experiment_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-Orochi-v4-Experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-Orochi-v4-Experiment</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-Orochi-v4-Experiment-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-Orochi-v4-Experiment
41bc20297c95adc8bc1d2e993110f671907f0c32
23.575775
0
12.248
false
false
false
false
1.886454
0.43207
43.207024
0.54625
35.299068
0.120846
12.084592
0.305369
7.38255
0.444938
15.483854
0.351978
27.997562
false
false
2025-01-28
0
Removed
DoppelReflEx_MN-12B-Mimicore-WhiteSnake_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake
ca84b8ab989a61658fc17e270b7344ed3885071f
25.05856
cc-by-nc-4.0
3
12.248
true
false
false
false
1.598794
0.44376
44.376033
0.560461
36.89971
0.13142
13.141994
0.317953
9.060403
0.456875
17.342708
0.365775
29.530511
true
false
2025-01-27
2025-01-27
1
DoppelReflEx/MN-12B-Mimicore-WhiteSnake (Merge)
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-1
f1fb881039e54ac80d84298b9054773a2bd72d21
18.866679
0
12.248
false
false
false
false
1.876918
0.390904
39.090391
0.486564
27.077964
0.07855
7.854985
0.305369
7.38255
0.378958
8.303125
0.31142
23.491061
false
false
2025-01-29
0
Removed
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-2
19.422543
0
12.248
false
false
false
false
2.634526
0.312393
31.239334
0.51264
30.66572
0.112538
11.253776
0.296141
6.152125
0.397469
11.516927
0.331366
25.707373
false
false
2025-01-29
0
Removed
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-3
12985da577e2bdcba11ad75b4aad6cf07cb67b51
19.601521
0
12.248
false
false
false
false
1.814318
0.430222
43.022181
0.48118
26.321395
0.089879
8.987915
0.302013
6.935123
0.368417
7.91875
0.319814
24.423759
false
false
2025-01-29
0
Removed
DoppelReflEx_MN-12B-Mimicore-WhiteSnake-v2-Experiment-4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Mimicore-WhiteSnake-v2-Experiment-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4
b7ec319e84b66dba6c620b9b01dc579cad96eb8d
21.794341
cc-by-nc-4.0
5
12.248
true
false
false
false
1.72712
0.424052
42.405152
0.518475
31.422947
0.114048
11.404834
0.310403
8.053691
0.400198
11.458073
0.334192
26.02135
true
false
2025-01-29
2025-01-29
1
DoppelReflEx/MN-12B-Mimicore-WhiteSnake-v2-Experiment-4 (Merge)
DoppelReflEx_MN-12B-Unleashed-Twilight_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-Unleashed-Twilight" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-Unleashed-Twilight</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-Unleashed-Twilight-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-Unleashed-Twilight
47bb9e79f33f659c911843c874ac29653a8c4a7b
22.564272
1
12.248
false
false
false
false
0.814314
0.350512
35.05122
0.552063
35.976107
0.095921
9.592145
0.328859
10.514541
0.438396
14.499479
0.367769
29.752142
false
false
2025-02-09
2025-02-10
1
DoppelReflEx/MN-12B-Unleashed-Twilight (Merge)
DoppelReflEx_MN-12B-WolFrame_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MN-12B-WolFrame" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MN-12B-WolFrame</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MN-12B-WolFrame-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MN-12B-WolFrame
44ef103ff2b5ba1bfa9e375357ea1c897cb33788
22.07872
cc-by-nc-4.0
5
12.248
true
false
false
false
1.681375
0.439739
43.973878
0.511681
29.99193
0.13142
13.141994
0.310403
8.053691
0.401469
10.716927
0.339345
26.593898
true
false
2025-01-29
2025-02-01
1
DoppelReflEx/MN-12B-WolFrame (Merge)
DoppelReflEx_MiniusLight-24B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MiniusLight-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MiniusLight-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MiniusLight-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MiniusLight-24B
3bb87fa4b45b5554a1bdd8554302ed1a22a3c3ef
26.21034
cc-by-nc-4.0
2
23.572
true
false
false
false
1.44034
0.257664
25.766411
0.625646
46.002969
0.126133
12.613293
0.358221
14.42953
0.431917
12.989583
0.509142
45.460254
true
false
2025-03-07
2025-03-04
1
DoppelReflEx/MiniusLight-24B (Merge)
DoppelReflEx_MiniusLight-24B-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MiniusLight-24B-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MiniusLight-24B-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MiniusLight-24B-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MiniusLight-24B-test
b71988742288492a5728e795e2dc4a0114178835
20.837213
0
23.572
false
false
false
false
0.578043
0.039368
3.936777
0.633393
46.956966
0.02568
2.567976
0.368289
15.771812
0.40925
9.322917
0.518201
46.466829
false
false
2025-03-04
0
Removed
DoppelReflEx_MiniusLight-24B-v1b-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MiniusLight-24B-v1b-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MiniusLight-24B-v1b-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MiniusLight-24B-v1b-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MiniusLight-24B-v1b-test
d2ec8d77a022b2ad2e207ea882a595aad591de2b
32.374894
0
23.572
false
false
false
false
1.374665
0.379114
37.911408
0.661715
50.638148
0.239426
23.942598
0.379195
17.225951
0.455729
16.032813
0.536486
48.498449
false
false
2025-03-04
0
Removed
DoppelReflEx_MiniusLight-24B-v1c-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MiniusLight-24B-v1c-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MiniusLight-24B-v1c-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MiniusLight-24B-v1c-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MiniusLight-24B-v1c-test
07fbae508e6e796a33439d40a543f8bd60c6c047
34.408318
cc-by-nc-4.0
4
23.572
true
false
false
false
4.322545
0.378589
37.858881
0.675268
52.840658
0.296828
29.682779
0.395134
19.35123
0.463417
16.860417
0.548703
49.85594
true
false
2025-03-04
2025-03-04
1
DoppelReflEx/MiniusLight-24B-v1c-test (Merge)
DoppelReflEx_MiniusLight-24B-v1d-test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DoppelReflEx/MiniusLight-24B-v1d-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DoppelReflEx/MiniusLight-24B-v1d-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DoppelReflEx__MiniusLight-24B-v1d-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DoppelReflEx/MiniusLight-24B-v1d-test
26bb7f9b94257b717afe96e6d19f05141ebe89ac
34.681949
cc-by-nc-4.0
2
23.572
true
false
false
false
1.497397
0.403243
40.324339
0.671203
52.358441
0.294562
29.456193
0.395134
19.35123
0.462083
16.727083
0.54887
49.874409
true
false
2025-03-07
2025-03-07
1
DoppelReflEx/MiniusLight-24B-v1d-test (Merge)
DreadPoor_Again-8B-Model_Stock_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Again-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Again-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Again-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Again-8B-Model_Stock
10052b086c6896ccd9d26522c45d348f1607c33c
26.002388
0
4.015
false
false
false
true
1.360654
0.672421
67.24214
0.53098
33.259461
0.120091
12.009063
0.301174
6.823266
0.398677
8.701302
0.351812
27.979093
false
false
2024-12-17
0
Removed
DreadPoor_Alita99-8B-LINEAR_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Alita99-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Alita99-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Alita99-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Alita99-8B-LINEAR
cfffa050f433660fc6159a82ce09fc2841fa0b6c
29.392264
apache-2.0
1
8.03
true
false
false
true
1.317245
0.719008
71.900779
0.544177
35.008918
0.164653
16.465257
0.316275
8.836689
0.426646
12.930729
0.380901
31.211215
true
false
2024-11-25
2024-11-26
1
DreadPoor/Alita99-8B-LINEAR (Merge)
DreadPoor_AnotherTest_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/AnotherTest" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/AnotherTest</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__AnotherTest-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/AnotherTest
40182ce563447e082186414c62e15af7fc33a431
19.505171
0
8.03
false
false
false
true
1.518019
0.470064
47.006387
0.468341
25.197138
0.061934
6.193353
0.297819
6.375839
0.421281
11.426823
0.287483
20.831486
false
false
2025-01-29
0
Removed
DreadPoor_Aspire-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire-8B-model_stock
5c23cb2aff877d0b7bdcfa4de43d1bc8a1852de0
28.611282
cc-by-nc-4.0
6
8.03
true
false
false
true
1.686256
0.714062
71.406202
0.527825
32.53427
0.149547
14.954683
0.314597
8.612975
0.42125
13.45625
0.37633
30.70331
true
false
2024-09-16
2024-09-17
1
DreadPoor/Aspire-8B-model_stock (Merge)
DreadPoor_Aspire_1.3-8B_model-stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_1.3-8B_model-stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_1.3-8B_model-stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_1.3-8B_model-stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_1.3-8B_model-stock
d36f5540e8c5654a9fdd8ece9ba8e88af26e5c40
28.388802
0
8.03
false
false
false
true
1.431563
0.706169
70.616852
0.530164
32.661851
0.169184
16.918429
0.307886
7.718121
0.410458
12.240625
0.371592
30.176936
false
false
2024-11-01
0
Removed
DreadPoor_Aspire_V2-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_V2-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_V2-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_V2-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_V2-8B-Model_Stock
e482d8852ec50b05420b865d27b7ed4682ab5ac8
29.023158
0
8.03
false
false
false
true
1.339613
0.737143
73.7143
0.532965
33.327406
0.175982
17.598187
0.32047
9.395973
0.389375
10.138542
0.369681
29.964539
false
false
2025-01-20
0
Removed
DreadPoor_Aspire_V2.1-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_V2.1-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_V2.1-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_V2.1-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_V2.1-8B-Model_Stock
c8b0acb6e3b5120cbdad9e6b2acf03ae9e9d1a0f
28.738387
0
8.03
false
false
false
true
1.350701
0.723754
72.375408
0.52364
32.187945
0.176737
17.673716
0.309564
7.941834
0.413594
11.132552
0.38007
31.118868
false
false
2025-01-20
0
Removed
DreadPoor_Aspire_V2_ALT-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_V2_ALT-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_V2_ALT-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_V2_ALT-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_V2_ALT-8B-Model_Stock
70e838e725b5f3889228103c1ee21f6eb7b0919c
29.059893
0
8.03
false
false
false
true
1.320314
0.738117
73.811708
0.526582
32.445169
0.172961
17.296073
0.324664
9.955257
0.3975
10.554167
0.372673
30.296986
false
false
2025-01-20
0
Removed
DreadPoor_Aspire_V2_ALT_ROW-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_V2_ALT_ROW-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_V2_ALT_ROW-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_V2_ALT_ROW-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_V2_ALT_ROW-8B-Model_Stock
7402061b436bbebb8b74b9f216cd8c788937a8f1
29.059893
0
8.03
false
false
false
true
1.290023
0.738117
73.811708
0.526582
32.445169
0.172961
17.296073
0.324664
9.955257
0.3975
10.554167
0.372673
30.296986
false
false
2025-01-20
0
Removed
DreadPoor_Aspire_V3-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_V3-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_V3-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_V3-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_V3-8B-Model_Stock
51a86cfb6f0067d113d31473399e34f13bb83d75
25.135023
0
8.03
false
false
false
true
1.313865
0.51188
51.187959
0.526796
32.683682
0.185801
18.58006
0.305369
7.38255
0.4015
11.620833
0.364195
29.355053
false
false
2025-01-21
0
Removed
DreadPoor_Aspire_V4-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_V4-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_V4-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_V4-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_V4-8B-Model_Stock
6a16bacdd20cb1a75a7b31376b46f7be73f8b02f
29.369067
0
8.03
false
false
false
true
1.350246
0.769416
76.941626
0.531404
33.205989
0.192598
19.259819
0.30453
7.270694
0.38674
9.442448
0.370844
30.093824
false
false
2025-01-22
0
Removed