| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | string | length 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | length 355–689 |
| fullname | string | length 4–102 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 classes |
| Submission Date | string | 263 classes |
| Generation | int64 | 0–10 |
| Base Model | string | length 4–102 |
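The schema above pairs each of the six benchmarks (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO) with a raw and a normalized column, plus an overall "Average ⬆️". A minimal sketch, assuming the Average is the plain mean of the six normalized scores (the grouping is inferred from the schema; values are copied from the bamec66557/VICIOUS_MESH-12B row below):

```python
# Sanity-check: "Average ⬆️" should equal the mean of the six normalized
# benchmark scores. Values copied from the bamec66557/VICIOUS_MESH-12B record.
record = {
    "fullname": "bamec66557/VICIOUS_MESH-12B",
    "IFEval": 37.156966,
    "BBH": 34.372516,
    "MATH Lvl 5": 13.444109,
    "GPQA": 10.402685,
    "MUSR": 11.544531,
    "MMLU-PRO": 29.761377,
    "Average": 22.780364,
}

benchmarks = ["IFEval", "BBH", "MATH Lvl 5", "GPQA", "MUSR", "MMLU-PRO"]
mean_score = sum(record[b] for b in benchmarks) / len(benchmarks)

# The reported Average matches the unweighted mean to 6 decimal places.
assert abs(mean_score - record["Average"]) < 1e-6
print(round(mean_score, 6))  # 22.780364
```

The same relation holds for the other complete records in this dump, which suggests the Average column carries no extra weighting.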
bamec66557_VICIOUS_MESH-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B">bamec66557/VICIOUS_MESH-12B</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-details">📑</a>
bamec66557/VICIOUS_MESH-12B
4a0a2cf1eca5766badb8ff2853e15f045de71a92
22.780364
0
6.124
false
false
false
false
3.360133
0.37157
37.156966
0.543602
34.372516
0.134441
13.444109
0.32802
10.402685
0.41049
11.544531
0.367852
29.761377
false
false
2024-12-18
0
Removed
bamec66557_VICIOUS_MESH-12B-0.1v_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-0.1v">bamec66557/VICIOUS_MESH-12B-0.1v</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-0.1v-details">📑</a>
bamec66557/VICIOUS_MESH-12B-0.1v
14e82cd2858767003bed53db1c0de82f6c7dd9bf
22.449453
0
6.124
false
false
false
false
2.148918
0.36575
36.574954
0.541228
34.130238
0.132175
13.217523
0.324664
9.955257
0.415823
11.011198
0.368268
29.80755
false
false
2024-12-18
0
Removed
bamec66557_VICIOUS_MESH-12B-0.X.ver_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-0.X.ver">bamec66557/VICIOUS_MESH-12B-0.X.ver</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-0.X.ver-details">📑</a>
bamec66557/VICIOUS_MESH-12B-0.X.ver
93bdb2c1d5644217e5f5e9bcbf669b18e3b05851
22.653126
0
6.124
false
false
false
false
3.375983
0.377565
37.756486
0.541625
34.089247
0.120091
12.009063
0.321309
9.50783
0.419823
12.877865
0.367104
29.678265
false
false
2024-12-18
0
Removed
bamec66557_VICIOUS_MESH-12B-ALPHA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-ALPHA">bamec66557/VICIOUS_MESH-12B-ALPHA</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-ALPHA-details">📑</a>
bamec66557/VICIOUS_MESH-12B-ALPHA
0db2b2268828869387868fb01f14217a97d28d2b
26.455512
apache-2.0
1
12.248
true
false
false
true
1.905354
0.636501
63.650115
0.509368
30.510146
0.136707
13.670695
0.313758
8.501119
0.420292
12.436458
0.369681
29.964539
true
false
2024-12-20
2024-12-20
1
bamec66557/VICIOUS_MESH-12B-ALPHA (Merge)
bamec66557_VICIOUS_MESH-12B-BETA_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-BETA">bamec66557/VICIOUS_MESH-12B-BETA</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-BETA-details">📑</a>
bamec66557/VICIOUS_MESH-12B-BETA
e5ee09e29f2079a0422b5b66c64a91b8d65ee13f
27.466293
apache-2.0
3
12.248
true
false
false
true
1.913221
0.672097
67.20967
0.515596
31.356607
0.132931
13.293051
0.316275
8.836689
0.43099
14.340365
0.367852
29.761377
true
false
2024-12-20
2024-12-20
1
bamec66557/VICIOUS_MESH-12B-BETA (Merge)
bamec66557_VICIOUS_MESH-12B-DELTA_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-DELTA">bamec66557/VICIOUS_MESH-12B-DELTA</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-DELTA-details">📑</a>
bamec66557/VICIOUS_MESH-12B-DELTA
b78353e18ccee5445cca15fa2e558cb77fd54bfe
25.939276
0
6.124
false
false
false
true
2.169282
0.646892
64.689247
0.505542
29.792447
0.137462
13.746224
0.312081
8.277405
0.405656
9.673698
0.36511
29.456634
false
false
2024-12-21
0
Removed
bamec66557_VICIOUS_MESH-12B-DIGAMMA_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-DIGAMMA">bamec66557/VICIOUS_MESH-12B-DIGAMMA</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-DIGAMMA-details">📑</a>
bamec66557/VICIOUS_MESH-12B-DIGAMMA
1cda02ade49cdc99cbfec09b3204b3e899ec1eb3
25.971654
0
6.124
false
false
false
true
1.98823
0.642921
64.292078
0.506117
29.83323
0.133686
13.36858
0.312919
8.389262
0.409656
10.407031
0.365858
29.539746
false
false
2024-12-21
0
Removed
bamec66557_VICIOUS_MESH-12B-EPSILON_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-EPSILON">bamec66557/VICIOUS_MESH-12B-EPSILON</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-EPSILON-details">📑</a>
bamec66557/VICIOUS_MESH-12B-EPSILON
882d08d9cd7a45aa36c5f31c13014dcf033a6b37
25.50473
0
6.124
false
false
false
true
2.092369
0.630456
63.045608
0.5038
29.596445
0.126133
12.613293
0.314597
8.612975
0.40699
9.740365
0.364777
29.419696
false
false
2024-12-21
0
Removed
bamec66557_VICIOUS_MESH-12B-GAMMA_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-GAMMA">bamec66557/VICIOUS_MESH-12B-GAMMA</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-GAMMA-details">📑</a>
bamec66557/VICIOUS_MESH-12B-GAMMA
b76049bd9081d7b3151f0cab6f2d804e3804bba0
26.916849
apache-2.0
2
12.248
true
false
false
true
1.803639
0.636176
63.617646
0.518191
31.485975
0.130665
13.066465
0.313758
8.501119
0.436323
15.207031
0.366606
29.622858
true
false
2024-12-20
2024-12-20
1
bamec66557/VICIOUS_MESH-12B-GAMMA (Merge)
bamec66557_VICIOUS_MESH-12B-NEMO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-NEMO">bamec66557/VICIOUS_MESH-12B-NEMO</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-NEMO-details">📑</a>
bamec66557/VICIOUS_MESH-12B-NEMO
21aa15e9d0425338d76c7f0dc7af70fb30bc3ad2
23.316444
apache-2.0
1
12.248
true
false
false
false
2.954811
0.402219
40.221944
0.544168
34.401417
0.126888
12.688822
0.323826
9.8434
0.425062
12.566146
0.371592
30.176936
true
false
2024-12-21
2024-12-22
1
bamec66557/VICIOUS_MESH-12B-NEMO (Merge)
bamec66557_VICIOUS_MESH-12B-OMEGA_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-OMEGA">bamec66557/VICIOUS_MESH-12B-OMEGA</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-OMEGA-details">📑</a>
bamec66557/VICIOUS_MESH-12B-OMEGA
4af6dbc11346fc9efdd7d44105cb12f47cfad220
27.495545
apache-2.0
3
12.248
true
false
false
true
1.994729
0.669973
66.997345
0.516644
31.523712
0.134441
13.444109
0.315436
8.724832
0.432323
14.540365
0.367686
29.742908
true
false
2024-12-22
2024-12-22
1
bamec66557/VICIOUS_MESH-12B-OMEGA (Merge)
bamec66557_VICIOUS_MESH-12B-UNION_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B-UNION">bamec66557/VICIOUS_MESH-12B-UNION</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B-UNION-details">📑</a>
bamec66557/VICIOUS_MESH-12B-UNION
f1331f2993652945d79ba6b4a67a2bf6876f269a
26.67624
0
6.124
false
false
false
true
2.033248
0.642871
64.287092
0.510664
30.463892
0.138973
13.897281
0.312081
8.277405
0.425688
13.444271
0.367188
29.6875
false
false
2024-12-21
0
Removed
bamec66557_VICIOUS_MESH-12B_Razor_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/VICIOUS_MESH-12B_Razor">bamec66557/VICIOUS_MESH-12B_Razor</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__VICIOUS_MESH-12B_Razor-details">📑</a>
bamec66557/VICIOUS_MESH-12B_Razor
cdbbf14d884c8bf7c4ae4cb5e2d30425a5340cfe
22.640682
0
6.124
false
false
false
false
2.141617
0.373643
37.364304
0.544713
34.562212
0.129909
12.990937
0.322987
9.731544
0.409156
11.544531
0.366855
29.650561
false
false
2024-12-18
0
Removed
bamec66557_mergekit-model_stock-zdaysvi_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/mergekit-model_stock-zdaysvi">bamec66557/mergekit-model_stock-zdaysvi</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__mergekit-model_stock-zdaysvi-details">📑</a>
bamec66557/mergekit-model_stock-zdaysvi
7ee1c3a87fe903c165be1393c7728fd8da001a86
26.244516
0
6.124
false
false
false
true
1.931984
0.642596
64.259609
0.50628
30.16636
0.135196
13.519637
0.313758
8.501119
0.412385
11.148177
0.36885
29.872193
false
false
2024-12-20
0
Removed
bamec66557_mergekit-ties-sinbkow_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bamec66557/mergekit-ties-sinbkow">bamec66557/mergekit-ties-sinbkow</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__mergekit-ties-sinbkow-details">📑</a>
bamec66557/mergekit-ties-sinbkow
871b4b0c0c77675c933aca9bc0573e85189a0a57
26.284691
0
6.124
false
false
false
true
1.800477
0.643196
64.319561
0.509208
30.622038
0.145015
14.501511
0.319631
9.284116
0.404479
10.059896
0.360289
28.921025
false
false
2024-12-21
0
Removed
belztjti_dffghgjh_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GlmForCausalLM
<a href="https://huggingface.co/belztjti/dffghgjh">belztjti/dffghgjh</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/belztjti__dffghgjh-details">📑</a>
belztjti/dffghgjh
20a115228627e753f03f876ad3c437d00aa5caf0
16.67122
0
9.543
false
false
false
true
2.321865
0.578424
57.842414
0.358171
9.713639
0.023414
2.34139
0.263423
1.789709
0.347458
1.432292
0.342171
26.907875
false
false
2025-01-30
0
Removed
belztjti_dtfgv_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a href="https://huggingface.co/belztjti/dtfgv">belztjti/dtfgv</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/belztjti__dtfgv-details">📑</a>
belztjti/dtfgv
014ea7eab9cb8fb71d661369bdaf34fe6a64b3f6
9.007601
0
9.543
false
false
false
true
2.865628
0.33445
33.445037
0.328153
5.52045
0.018127
1.812689
0.269295
2.572707
0.379396
5.091146
0.150432
5.603576
false
false
2025-01-29
0
Removed
benhaotang_phi4-qwq-sky-t1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a href="https://huggingface.co/benhaotang/phi4-qwq-sky-t1">benhaotang/phi4-qwq-sky-t1</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/benhaotang__phi4-qwq-sky-t1-details">📑</a>
benhaotang/phi4-qwq-sky-t1
de04971083243b7cfe0447e20badb1847bd7cef6
31.018253
mit
2
14.66
true
false
false
false
1.901462
0.045962
4.596249
0.671052
52.6124
0.410121
41.012085
0.395134
19.35123
0.489958
21.378125
0.524435
47.159427
true
false
2025-01-17
2025-01-21
1
benhaotang/phi4-qwq-sky-t1 (Merge)
beomi_gemma-mling-7b_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
GemmaForCausalLM
<a href="https://huggingface.co/beomi/gemma-mling-7b">beomi/gemma-mling-7b</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/beomi__gemma-mling-7b-details">📑</a>
beomi/gemma-mling-7b
3f442e28bd50db6c438ce2a15b3a003532babba0
11.392174
other
14
8.538
true
false
false
false
3.287011
0.202909
20.290939
0.406759
17.631391
0.054381
5.438066
0.25
0
0.375854
6.848437
0.263298
18.144208
false
false
2024-04-15
2024-07-17
0
beomi/gemma-mling-7b
beowolx_CodeNinja-1.0-OpenChat-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a href="https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B">beowolx/CodeNinja-1.0-OpenChat-7B</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/beowolx__CodeNinja-1.0-OpenChat-7B-details">📑</a>
beowolx/CodeNinja-1.0-OpenChat-7B
9934c04c767e6ae0f792712a060f02915391d4ec
20.460682
mit
104
7.242
true
false
false
true
1.272145
0.544677
54.467701
0.444134
21.713423
0.067221
6.722054
0.294463
5.928412
0.424323
11.540365
0.301529
22.392139
false
false
2023-12-20
2024-07-30
0
beowolx/CodeNinja-1.0-OpenChat-7B
berkeley-nest_Starling-LM-7B-alpha_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a href="https://huggingface.co/berkeley-nest/Starling-LM-7B-alpha">berkeley-nest/Starling-LM-7B-alpha</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/berkeley-nest__Starling-LM-7B-alpha-details">📑</a>
berkeley-nest/Starling-LM-7B-alpha
1dddf3b95bc1391f6307299eb1c162c194bde9bd
20.839361
apache-2.0
558
7.242
true
false
false
true
1.103258
0.548049
54.804918
0.444007
21.954028
0.083837
8.383686
0.29698
6.263982
0.41201
9.501302
0.317154
24.128251
false
true
2023-11-25
2024-06-12
0
berkeley-nest/Starling-LM-7B-alpha
bfuzzy1_Gunny_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a href="https://huggingface.co/bfuzzy1/Gunny">bfuzzy1/Gunny</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bfuzzy1__Gunny-details">📑</a>
bfuzzy1/Gunny
4648b9fafbbf5871fef317cdf9b76c3b7da6d66d
23.34108
0
3.213
false
false
false
true
1.667573
0.712863
71.286298
0.454599
22.991779
0.172961
17.296073
0.278523
3.803132
0.358281
2.01849
0.303856
22.650709
false
false
2024-11-04
2024-12-20
0
bfuzzy1/Gunny
bfuzzy1_acheron_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a href="https://huggingface.co/bfuzzy1/acheron">bfuzzy1/acheron</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bfuzzy1__acheron-details">📑</a>
bfuzzy1/acheron
10f0384c0363f63a17f41f1cf09f9a317a3ee957
4.974673
0
0.514
false
false
false
false
0.325365
0.198313
19.83127
0.310792
3.737588
0.016616
1.661631
0.239094
0
0.351052
3.548177
0.109624
1.069371
false
false
2024-10-12
2024-12-24
1
bfuzzy1/acheron (Merge)
bfuzzy1_acheron-c_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a href="https://huggingface.co/bfuzzy1/acheron-c">bfuzzy1/acheron-c</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bfuzzy1__acheron-c-details">📑</a>
bfuzzy1/acheron-c
dd9b3e9f550ab5c48a8349ddfea534996f4a28c4
4.29082
0
0.514
false
false
false
true
0.498206
0.192867
19.286715
0.302607
2.769027
0.003021
0.302115
0.247483
0
0.338219
1.477344
0.117188
1.909722
false
false
2024-12-23
2024-12-23
1
bfuzzy1/acheron-c (Merge)
bfuzzy1_acheron-d_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a href="https://huggingface.co/bfuzzy1/acheron-d">bfuzzy1/acheron-d</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bfuzzy1__acheron-d-details">📑</a>
bfuzzy1/acheron-d
337fc6d265062b22b368debdf42deb10af58b25e
4.988235
0
0.514
false
false
false
false
0.327473
0.192542
19.254245
0.313996
4.122247
0.015106
1.510574
0.236577
0
0.349719
3.548177
0.113447
1.494164
false
false
2025-01-03
2025-01-03
1
bfuzzy1/acheron-d (Merge)
bfuzzy1_acheron-m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a href="https://huggingface.co/bfuzzy1/acheron-m">bfuzzy1/acheron-m</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bfuzzy1__acheron-m-details">📑</a>
bfuzzy1/acheron-m
15a592a961c17aca78b16851ef42fd55b24c2d09
4.225198
other
0
0.514
true
false
false
true
0.373626
0.175831
17.583124
0.292844
2.182041
0.009063
0.906344
0.260067
1.342282
0.348667
2.083333
0.111287
1.254063
false
false
2025-01-10
2025-01-10
2
bfuzzy1/acheron-d (Merge)
bfuzzy1_acheron-m1a-llama_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
LlamaForCausalLM
<a href="https://huggingface.co/bfuzzy1/acheron-m1a-llama">bfuzzy1/acheron-m1a-llama</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bfuzzy1__acheron-m1a-llama-details">📑</a>
bfuzzy1/acheron-m1a-llama
f6b5f9afe3163767fa51dd1dd66d2fcd829ffc7d
3.348613
other
0
0.514
true
false
false
true
1.07387
0.112458
11.245828
0.295605
2.545409
0.007553
0.755287
0.260067
1.342282
0.363302
2.579427
0.114611
1.623449
false
false
2025-01-10
2025-01-10
3
bfuzzy1/acheron-d (Merge)
bfuzzy1_llambses-1_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a href="https://huggingface.co/bfuzzy1/llambses-1">bfuzzy1/llambses-1</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bfuzzy1__llambses-1-details">📑</a>
bfuzzy1/llambses-1
73d190c1726f22de8bb1be333d93cfeebb550984
19.837073
apache-2.0
0
7.242
true
false
false
false
0.936583
0.355384
35.538372
0.504698
31.077833
0.068731
6.873112
0.297819
6.375839
0.452906
15.379948
0.313996
23.777335
true
false
2024-10-10
2025-01-03
1
bfuzzy1/llambses-1 (Merge)
bhuvneshsaini_merged_model_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a href="https://huggingface.co/bhuvneshsaini/merged_model">bhuvneshsaini/merged_model</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bhuvneshsaini__merged_model-details">📑</a>
bhuvneshsaini/merged_model
35d045ae54b9bdf334b1c28becd85746cf4e9a38
5.795749
mit
0
4.715
true
false
false
false
0.716432
0.181277
18.127679
0.335978
7.617387
0
0
0.25
0
0.349719
4.08151
0.144531
4.947917
false
false
2024-12-13
2024-12-17
0
bhuvneshsaini/merged_model
bigcode_starcoder2-15b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Starcoder2ForCausalLM
<a href="https://huggingface.co/bigcode/starcoder2-15b">bigcode/starcoder2-15b</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigcode__starcoder2-15b-details">📑</a>
bigcode/starcoder2-15b
46d44742909c03ac8cee08eb03fdebce02e193ec
12.539175
bigcode-openrail-m
595
15.958
true
false
false
false
67.686269
0.278022
27.802231
0.444796
20.373541
0.059668
5.966767
0.27349
3.131991
0.350094
2.928385
0.235289
15.032137
false
true
2024-02-20
2024-06-09
0
bigcode/starcoder2-15b
bigcode_starcoder2-3b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Starcoder2ForCausalLM
<a href="https://huggingface.co/bigcode/starcoder2-3b">bigcode/starcoder2-3b</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigcode__starcoder2-3b-details">📑</a>
bigcode/starcoder2-3b
733247c55e3f73af49ce8e9c7949bf14af205928
6.549148
bigcode-openrail-m
169
3.03
true
false
false
false
0.893258
0.203708
20.370838
0.350871
8.909299
0.015106
1.510574
0.244128
0
0.343458
1.432292
0.163647
7.071882
false
true
2023-11-29
2024-06-09
0
bigcode/starcoder2-3b
bigcode_starcoder2-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Starcoder2ForCausalLM
<a href="https://huggingface.co/bigcode/starcoder2-7b">bigcode/starcoder2-7b</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigcode__starcoder2-7b-details">📑</a>
bigcode/starcoder2-7b
a3d33687b51284b528abeb17830776ffd24892a9
8.293438
bigcode-openrail-m
172
7.174
true
false
false
false
1.012803
0.220919
22.091938
0.366099
11.39511
0.030967
3.096677
0.251678
0.223714
0.379333
5.816667
0.164229
7.136525
false
true
2024-02-20
2024-06-09
0
bigcode/starcoder2-7b
bigscience_bloom-1b1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
BloomForCausalLM
<a href="https://huggingface.co/bigscience/bloom-1b1">bigscience/bloom-1b1</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-1b1-details">📑</a>
bigscience/bloom-1b1
eb3dd7399312f5f94fd13f41d2f318117d3eb1e4
4.025156
bigscience-bloom-rail-1.0
62
1.065
true
false
false
false
1.434043
0.137338
13.733782
0.310728
4.042705
0.005287
0.528701
0.259228
1.230425
0.37
3.416667
0.110788
1.198655
false
true
2022-05-19
2024-06-13
0
bigscience/bloom-1b1
bigscience_bloom-1b7_bfloat16
bfloat16
🟢 pretrained
🟢
Original
BloomForCausalLM
<a href="https://huggingface.co/bigscience/bloom-1b7">bigscience/bloom-1b7</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-1b7-details">📑</a>
bigscience/bloom-1b7
cc72a88036c2fb937d65efeacc57a0c2ef5d6fe5
4.046754
bigscience-bloom-rail-1.0
121
1.722
true
false
false
false
1.636719
0.10439
10.438969
0.314055
4.397453
0.005287
0.528701
0.258389
1.118568
0.388573
6.838281
0.108627
0.958555
false
true
2022-05-19
2024-06-13
0
bigscience/bloom-1b7
bigscience_bloom-3b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
BloomForCausalLM
<a href="https://huggingface.co/bigscience/bloom-3b">bigscience/bloom-3b</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-3b-details">📑</a>
bigscience/bloom-3b
52bc5b43010b4844513826b8be3f78c7344c37d7
4.387894
bigscience-bloom-rail-1.0
90
3.003
true
false
false
false
1.992112
0.127096
12.709611
0.306292
3.420098
0.008308
0.830816
0.239933
0
0.398063
7.891146
0.113281
1.475694
false
true
2022-05-19
2024-06-13
0
bigscience/bloom-3b
bigscience_bloom-560m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
BloomForCausalLM
<a href="https://huggingface.co/bigscience/bloom-560m">bigscience/bloom-560m</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-560m-details">📑</a>
bigscience/bloom-560m
ac2ae5fab2ce3f9f40dc79b5ca9f637430d24971
3.507244
bigscience-bloom-rail-1.0
351
0.559
true
false
false
false
1.525432
0.062024
6.202432
0.302595
2.885364
0.003776
0.377644
0.261745
1.565996
0.403083
8.185417
0.116439
1.826611
false
true
2022-05-19
2024-06-13
0
bigscience/bloom-560m
bigscience_bloom-7b1_float16
float16
🟢 pretrained
🟢
Original
BloomForCausalLM
<a href="https://huggingface.co/bigscience/bloom-7b1">bigscience/bloom-7b1</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bigscience__bloom-7b1-details">📑</a>
bigscience/bloom-7b1
6232703e399354503377bf59dfbb8397fd569e4a
3.79551
bigscience-bloom-rail-1.0
203
7.069
true
false
false
false
2.01155
0.132217
13.221696
0.311372
4.038809
0.005287
0.528701
0.264262
1.901566
0.348698
1.920573
0.110455
1.161717
false
true
2022-05-19
2024-06-13
0
bigscience/bloom-7b1
bluuwhale_L3-SthenoMaid-8B-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a href="https://huggingface.co/bluuwhale/L3-SthenoMaid-8B-V1">bluuwhale/L3-SthenoMaid-8B-V1</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bluuwhale__L3-SthenoMaid-8B-V1-details">📑</a>
bluuwhale/L3-SthenoMaid-8B-V1
f8e65823aa02752c9c08aa69c7a24bfa94058a9b
25.83976
3
8.03
false
false
false
true
1.205451
0.73447
73.447009
0.521876
32.398153
0.108006
10.800604
0.280201
4.026846
0.368698
4.853906
0.365608
29.512042
false
false
2024-06-09
2025-01-07
1
bluuwhale/L3-SthenoMaid-8B-V1 (Merge)
bond005_meno-tiny-0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a href="https://huggingface.co/bond005/meno-tiny-0.1">bond005/meno-tiny-0.1</a> <a href="https://huggingface.co/datasets/open-llm-leaderboard/bond005__meno-tiny-0.1-details">📑</a>
bond005/meno-tiny-0.1
e45b5605e2209a143c823f4e9c7c49705955cdb1
18.850917
apache-2.0
10
1.544
true
false
false
true
3.733376
0.454976
45.497613
0.426291
19.642709
0.138973
13.897281
0.281879
4.250559
0.418458
9.973958
0.27859
19.843381
false
false
2024-11-18
2025-01-11
1
bond005/meno-tiny-0.1 (Merge)
bosonai_Higgs-Llama-3-70B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bosonai/Higgs-Llama-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bosonai/Higgs-Llama-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bosonai__Higgs-Llama-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bosonai/Higgs-Llama-3-70B
b2c7540768046dfdae7a0cb846a7da6c41d826b1
33.525398
other
220
70.554
true
false
false
true
27.453694
0.556068
55.60679
0.625766
45.897406
0.252266
25.226586
0.366611
15.548098
0.447083
15.51875
0.490193
43.354758
false
false
2024-06-05
2024-09-18
1
meta-llama/Meta-Llama-3-70B
braindao_DeepSeek-R1-Distill-Qwen-1.5B-Blunt_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-1.5B-Blunt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-1.5B-Blunt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-1.5B-Blunt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-1.5B-Blunt
ef4f5c21d64e3dc2f67718e62d814827d19188ee
7.552805
0
1.777
false
false
false
true
0.603658
0.261136
26.113601
0.277437
1.102363
0.138218
13.821752
0.247483
0
0.359521
2.240104
0.118351
2.039007
false
false
2025-02-20
2025-02-20
0
braindao/DeepSeek-R1-Distill-Qwen-1.5B-Blunt
braindao_DeepSeek-R1-Distill-Qwen-1.5B-Reflective_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-1.5B-Reflective" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-1.5B-Reflective</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-1.5B-Reflective-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-1.5B-Reflective
0444175d262f264654af671f423ca72f4d284478
8.705957
0
1.777
false
false
false
true
0.622834
0.303276
30.327642
0.290844
1.74776
0.163142
16.314199
0.260906
1.454139
0.335552
0.94401
0.113032
1.447991
false
false
2025-02-20
2025-02-20
0
braindao/DeepSeek-R1-Distill-Qwen-1.5B-Reflective
braindao_DeepSeek-R1-Distill-Qwen-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B
00463c8b956246647d2d0fc4027df4e569194aeb
13.942561
mit
0
14.77
true
false
false
true
1.897892
0.417158
41.715759
0.303297
3.271233
0.175982
17.598187
0.280201
4.026846
0.448792
15.632292
0.112699
1.411052
false
false
2025-03-04
2025-03-04
0
braindao/DeepSeek-R1-Distill-Qwen-14B
braindao_DeepSeek-R1-Distill-Qwen-14B-ABUB-ST_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B-ABUB-ST" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B-ABUB-ST</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-ABUB-ST-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B-ABUB-ST
2ebe36e929737546f03ef37e845c745f4068752f
29.311295
apache-2.0
0
14.77
true
false
false
true
1.891297
0.375192
37.519227
0.49269
27.634827
0.501511
50.151057
0.344799
12.639821
0.422063
11.891146
0.424285
36.031693
false
false
2025-03-12
2025-03-12
1
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt
braindao_DeepSeek-R1-Distill-Qwen-14B-Blunt_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-Blunt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt
ad0d32e834c1187b5894d742215fd864e94dd3ac
17.917999
0
14.77
false
false
false
true
1.936472
0.561163
56.116327
0.32829
6.126853
0.163897
16.389728
0.302852
7.04698
0.455427
16.861719
0.144697
4.966386
false
false
2025-02-26
2025-03-03
0
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt
braindao_DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored
027525b1379cf8dd8272c46537f55281d8cdd9b9
16.624426
0
14.77
false
false
false
true
1.954473
0.542179
54.21792
0.317034
4.607188
0.163142
16.314199
0.282718
4.362416
0.448698
15.453906
0.143118
4.790928
false
false
2025-02-26
2025-03-06
0
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored
braindao_DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt
ad287e756594c171e78745b3314e9d146cd3a158
17.942347
0
14.77
false
false
false
true
1.929295
0.522146
52.214568
0.319858
5.067571
0.250755
25.075529
0.278523
3.803132
0.452698
16.120573
0.148354
5.37271
false
false
2025-02-27
2025-03-03
0
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt
braindao_DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt-Reflective_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt-Reflective" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt-Reflective</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt-Reflective-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt-Reflective
d048407437a3af102d59bb7f60db0bbe1c6b3bd8
17.874122
0
14.77
false
false
false
true
1.887683
0.554044
55.404438
0.337106
6.901046
0.23716
23.716012
0.277685
3.691275
0.42476
11.928385
0.150432
5.603576
false
false
2025-03-05
2025-03-05
0
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Blunt-Reflective
braindao_DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Reflective_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Reflective" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Reflective</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Reflective-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Reflective
7e662880a6327fd3ef8906358c526edfd48b861b
15.258011
0
14.77
false
false
false
true
1.929412
0.513927
51.392749
0.301344
2.698522
0.147281
14.728097
0.287752
5.033557
0.443333
14.483333
0.128906
3.211806
false
false
2025-03-05
2025-03-05
0
braindao/DeepSeek-R1-Distill-Qwen-14B-Blunt-Uncensored-Reflective
braindao_DeepSeek-R1-Distill-Qwen-14B-Reflective_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-14B-Reflective" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-14B-Reflective</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-14B-Reflective-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-14B-Reflective
ad501b68fd42c312c64c70b1f63a9562679711c2
14.389854
0
14.77
false
false
false
true
1.992267
0.429023
42.902277
0.301226
3.035855
0.191843
19.18429
0.272651
3.020134
0.455396
16.757813
0.112949
1.438756
false
false
2025-02-26
2025-03-03
0
braindao/DeepSeek-R1-Distill-Qwen-14B-Reflective
braindao_DeepSeek-R1-Distill-Qwen-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-7B
b4409922f58a06ce7296032b7baba2eaa45a4000
11.3986
mit
0
7.616
true
false
false
true
0.681426
0.396799
39.679938
0.288678
1.976671
0.191843
19.18429
0.261745
1.565996
0.376667
4.416667
0.114112
1.568041
false
false
2025-03-04
2025-03-04
0
braindao/DeepSeek-R1-Distill-Qwen-7B
braindao_DeepSeek-R1-Distill-Qwen-7B-Blunt_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-7B-Blunt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-7B-Blunt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-7B-Blunt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-7B-Blunt
fecc50fce2183143952083d053685194640c2ad1
12.85078
0
7.616
false
false
false
true
0.672693
0.426625
42.662469
0.290178
2.149819
0.214502
21.450151
0.270973
2.796421
0.38851
6.163802
0.116938
1.882018
false
false
2025-02-20
2025-03-03
0
braindao/DeepSeek-R1-Distill-Qwen-7B-Blunt
braindao_DeepSeek-R1-Distill-Qwen-7B-ORPO-Uncensored_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-7B-ORPO-Uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-7B-ORPO-Uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-7B-ORPO-Uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-7B-ORPO-Uncensored
1a94375fd2c68af28d112348ad5b8b22cb1b3c2f
10.721035
apache-2.0
0
7.616
true
false
false
true
0.678523
0.36545
36.545034
0.295844
2.744263
0.173716
17.371601
0.253356
0.447427
0.384604
5.742188
0.113281
1.475694
false
false
2025-03-13
2025-03-13
2
deepseek-ai/DeepSeek-R1-Distill-Qwen-7B
braindao_DeepSeek-R1-Distill-Qwen-7B-Reflective_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/DeepSeek-R1-Distill-Qwen-7B-Reflective" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/DeepSeek-R1-Distill-Qwen-7B-Reflective</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__DeepSeek-R1-Distill-Qwen-7B-Reflective-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/DeepSeek-R1-Distill-Qwen-7B-Reflective
62714830b4f061e1f132b19ad1b95db46653f8c2
11.718078
0
7.616
false
false
false
true
0.668468
0.392178
39.217831
0.290678
2.0813
0.202417
20.241692
0.254195
0.559284
0.39
6.483333
0.115525
1.72503
false
false
2025-02-26
2025-03-03
0
braindao/DeepSeek-R1-Distill-Qwen-7B-Reflective
braindao_Qwen2.5-14B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/Qwen2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/Qwen2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__Qwen2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/Qwen2.5-14B
ca710582977c49cc263b4fd02de159f9c51dc76c
32.436239
apache-2.0
0
14.77
true
false
false
true
2.251569
0.540855
54.085493
0.585266
41.263514
0.292296
29.229607
0.373322
16.442953
0.412354
10.444271
0.488364
43.151596
false
false
2025-03-06
2025-03-06
0
braindao/Qwen2.5-14B
braindao_Qwen2.5-14B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/Qwen2.5-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/Qwen2.5-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__Qwen2.5-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/Qwen2.5-14B-Instruct
f45a202018ab4796db270d79f5f2b193237d37fb
41.60549
apache-2.0
0
14.77
true
false
false
true
4.755065
0.814254
81.425396
0.640364
48.57309
0.55287
55.287009
0.328859
10.514541
0.414
10.616667
0.488946
43.216238
false
false
2025-03-06
2025-03-06
1
Qwen/Qwen2.5-14B
braindao_iq-code-evmind-0.5b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/braindao/iq-code-evmind-0.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">braindao/iq-code-evmind-0.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/braindao__iq-code-evmind-0.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
braindao/iq-code-evmind-0.5b
c34cdd02ff7488eea0ac26110b0b7cb277a0bf1b
7.022414
apache-2.0
2
0.494
true
false
false
true
0.507363
0.321561
32.156124
0.316374
4.260915
0.024169
2.416918
0.241611
0
0.330375
1.196875
0.118933
2.10365
false
false
2024-12-23
2025-02-20
1
braindao/iq-code-evmind-0.5b-instruct-v0.2411.4
brgx53_3Bgeneral-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/brgx53/3Bgeneral-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">brgx53/3Bgeneral-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Bgeneral-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
brgx53/3Bgeneral-ECE-PRYMMAL-Martial
78ee3bde02df349ee7161f9c2a5b36161c294009
23.281187
apache-2.0
0
3.821
true
false
false
false
1.307304
0.328931
32.893057
0.545801
36.673582
0.13142
13.141994
0.324664
9.955257
0.437281
14.426823
0.393368
32.59641
true
false
2024-10-23
2024-10-23
1
brgx53/3Bgeneral-ECE-PRYMMAL-Martial (Merge)
brgx53_3Bgeneralv2-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Bgeneralv2-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial
8525f801c47b2bce2ca4dad360ce71b2cb6b370b
31.482397
apache-2.0
15
3
true
false
false
false
2.00682
0.567708
56.770813
0.56072
37.250633
0.349698
34.969789
0.311242
8.165548
0.435635
12.78776
0.450549
38.949837
true
false
2024-11-08
2024-11-08
1
brgx53/3Bgeneralv2-ECE-PRYMMAL-Martial (Merge)
brgx53_3Blareneg-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/brgx53/3Blareneg-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">brgx53/3Blareneg-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Blareneg-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
brgx53/3Blareneg-ECE-PRYMMAL-Martial
abac4757125a66a427fb82751bf171dabaea3458
22.756317
apache-2.0
0
3.821
true
false
false
false
1.615069
0.287639
28.763902
0.535846
35.452586
0.120846
12.084592
0.334732
11.297539
0.442896
15.428646
0.401596
33.510638
true
false
2024-10-23
2024-10-23
1
brgx53/3Blareneg-ECE-PRYMMAL-Martial (Merge)
brgx53_3Blarenegv2-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/brgx53/3Blarenegv2-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">brgx53/3Blarenegv2-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/brgx53__3Blarenegv2-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
brgx53/3Blarenegv2-ECE-PRYMMAL-Martial
304038fc2b2527e31c738f9091206253a0d40f6c
31.457001
apache-2.0
0
7.616
true
false
false
false
1.373792
0.566184
56.618439
0.56072
37.250633
0.349698
34.969789
0.311242
8.165548
0.435635
12.78776
0.450549
38.949837
true
false
2024-11-08
2024-11-08
1
brgx53/3Blarenegv2-ECE-PRYMMAL-Martial (Merge)
brgx53_Barracuda-PRYMMAL-ECE-TW3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/brgx53/Barracuda-PRYMMAL-ECE-TW3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">brgx53/Barracuda-PRYMMAL-ECE-TW3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/brgx53__Barracuda-PRYMMAL-ECE-TW3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
brgx53/Barracuda-PRYMMAL-ECE-TW3
5b24379a24328b77300eca1540915408151a9f20
3.916928
0
1.544
false
false
false
false
0.62954
0.164016
16.401592
0.300246
2.753427
0.002266
0.226586
0.253356
0.447427
0.360854
2.640104
0.109292
1.032432
false
false
2025-03-10
2025-03-10
0
brgx53/Barracuda-PRYMMAL-ECE-TW3
brgx53_LaConfiance-PRYMMAL-ECE-TW3_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/brgx53/LaConfiance-PRYMMAL-ECE-TW3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">brgx53/LaConfiance-PRYMMAL-ECE-TW3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/brgx53__LaConfiance-PRYMMAL-ECE-TW3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
brgx53/LaConfiance-PRYMMAL-ECE-TW3
09165d67f26be2a1bc6a319424fc2f35b1faf840
4.255169
0
1.777
false
false
false
false
0.607411
0.157921
15.792098
0.296242
1.986805
0
0
0.251678
0.223714
0.384573
5.904948
0.114611
1.623449
false
false
2025-03-10
2025-03-10
0
brgx53/LaConfiance-PRYMMAL-ECE-TW3
bunnycore_Best-Mix-Llama-3.1-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Best-Mix-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Best-Mix-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Best-Mix-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Best-Mix-Llama-3.1-8B
4bde0e60ac20d6944b1fbdfb3456efea8ba59ae9
9.644596
0
8.03
false
false
false
false
1.810005
0.206706
20.670598
0.343178
7.255276
0.205438
20.543807
0.265101
2.013423
0.292854
1.106771
0.156499
6.277704
false
false
2024-10-10
0
Removed
bunnycore_Blabbertron-1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Blabbertron-1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Blabbertron-1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Blabbertron-1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Blabbertron-1.0
329e37a8e0c1e6289418ec00ee3895315adae416
36.224715
3
7.613
false
false
false
true
0.671961
0.743338
74.333768
0.549655
36.054612
0.492447
49.244713
0.302013
6.935123
0.433688
13.510938
0.435422
37.269134
false
false
2025-03-03
2025-03-03
1
bunnycore/Blabbertron-1.0 (Merge)
bunnycore_Blabbertron-1.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Blabbertron-1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Blabbertron-1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Blabbertron-1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Blabbertron-1.1
bfa4acda0123e6579f9460441c65e1c80f22762f
36.192888
1
7.613
false
false
false
true
0.663739
0.726527
72.652673
0.5534
36.60739
0.480363
48.036254
0.302852
7.04698
0.441563
14.695313
0.443068
38.11872
false
false
2025-03-04
2025-03-04
1
bunnycore/Blabbertron-1.1 (Merge)
bunnycore_CyberCore-Qwen-2.1-7B_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/CyberCore-Qwen-2.1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/CyberCore-Qwen-2.1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__CyberCore-Qwen-2.1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/CyberCore-Qwen-2.1-7B
98e69ba1cd70444b90178e1253e904d1892593c8
30.984211
2
7.616
false
false
false
true
1.345368
0.576576
57.657571
0.557209
36.966533
0.358761
35.876133
0.307886
7.718121
0.41449
9.411198
0.444481
38.275709
false
false
2024-11-21
2024-11-23
1
bunnycore/CyberCore-Qwen-2.1-7B (Merge)
bunnycore_DeepQwen-3B-LCoT-SCE_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/DeepQwen-3B-LCoT-SCE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/DeepQwen-3B-LCoT-SCE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__DeepQwen-3B-LCoT-SCE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/DeepQwen-3B-LCoT-SCE
200ae320fb7846016990e58b46acf78ce0a3b946
20.33846
1
3.396
false
false
false
true
0.773439
0.448981
44.898093
0.451231
23.559546
0.246979
24.697885
0.262584
1.677852
0.351396
1.757812
0.328956
25.439569
false
false
2025-02-23
2025-02-26
1
bunnycore/DeepQwen-3B-LCoT-SCE (Merge)
bunnycore_DeepSeek-R1-Distill-Qwen-7B-RRP-Ex_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/DeepSeek-R1-Distill-Qwen-7B-RRP-Ex" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/DeepSeek-R1-Distill-Qwen-7B-RRP-Ex</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__DeepSeek-R1-Distill-Qwen-7B-RRP-Ex-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/DeepSeek-R1-Distill-Qwen-7B-RRP-Ex
cd233008ad60c15ed06f7de327e11f0734432b85
14.61649
0
7.616
false
false
false
true
1.336502
0.390105
39.010492
0.349411
8.396454
0.165408
16.540785
0.278523
3.803132
0.366313
3.189063
0.250831
16.759013
false
false
2025-01-27
0
Removed
bunnycore_DeepThinker-7B-Sce-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/DeepThinker-7B-Sce-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/DeepThinker-7B-Sce-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__DeepThinker-7B-Sce-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/DeepThinker-7B-Sce-v1
93d03ae7c0059068cdd1cbfbbca6f3822d699419
4.766577
0
7.613
false
false
false
true
1.425926
0.1218
12.180016
0.301828
2.520598
0.009819
0.981873
0.251678
0.223714
0.419427
11.328385
0.112284
1.364879
false
false
2025-02-06
0
Removed
bunnycore_DeepThinker-7B-Sce-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/DeepThinker-7B-Sce-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/DeepThinker-7B-Sce-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__DeepThinker-7B-Sce-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/DeepThinker-7B-Sce-v2
7be8b128a7ce3a065aa6aa6518dc3d0c3c4c24ff
5.521537
3
7.613
false
false
false
true
1.424968
0.163066
16.306622
0.305684
3.256507
0.011329
1.132931
0.258389
1.118568
0.410063
9.691146
0.114611
1.623449
false
false
2025-02-06
2025-02-06
1
bunnycore/DeepThinker-7B-Sce-v2 (Merge)
bunnycore_FuseCyberMix-Qwen-2.5-7B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/FuseCyberMix-Qwen-2.5-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/FuseCyberMix-Qwen-2.5-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__FuseCyberMix-Qwen-2.5-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/FuseCyberMix-Qwen-2.5-7B-Instruct
e8cb3470dfa0c8e1d9f661168027241a0908876f
34.505747
2
7.616
false
false
false
true
1.302795
0.701922
70.192201
0.551797
36.368619
0.484139
48.413897
0.29698
6.263982
0.402031
8.720573
0.433677
37.075207
false
false
2024-12-20
2024-12-20
1
bunnycore/FuseCyberMix-Qwen-2.5-7B-Instruct (Merge)
bunnycore_FuseQwQen-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/FuseQwQen-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/FuseQwQen-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__FuseQwQen-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/FuseQwQen-7B
18b2a6249bdef2ff53d112b8008a7b8c3b8b9778
34.677867
3
7.616
false
false
false
true
1.431779
0.727451
72.745094
0.550426
35.909589
0.436556
43.655589
0.294463
5.928412
0.421688
11.977604
0.440658
37.850916
false
false
2024-12-22
2024-12-22
1
bunnycore/FuseQwQen-7B (Merge)
bunnycore_FwF-Qwen-7B-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/FwF-Qwen-7B-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/FwF-Qwen-7B-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__FwF-Qwen-7B-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/FwF-Qwen-7B-0.1
7a6e2f5aac25da186af9a35ca7b10cad1f0f8e40
22.055184
1
7.616
false
false
false
true
1.642909
0.300454
30.045391
0.501927
30.502106
0.276435
27.643505
0.270973
2.796421
0.395208
7.334375
0.406084
34.009309
false
false
2025-01-06
2025-01-06
1
bunnycore/FwF-Qwen-7B-0.1 (Merge)
bunnycore_FwF-Qwen-7B-0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/FwF-Qwen-7B-0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/FwF-Qwen-7B-0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__FwF-Qwen-7B-0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/FwF-Qwen-7B-0.2
f328f22194bfe9586d7dc7bb671af45922e068c7
30.049607
5
7.616
false
false
false
true
1.550555
0.447907
44.790711
0.559641
37.66718
0.425982
42.598187
0.290268
5.369128
0.421781
12.289323
0.438248
37.583112
false
false
2025-01-07
2025-01-15
1
bunnycore/FwF-Qwen-7B-0.2 (Merge)
bunnycore_Gemma-2-2B-Smart_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Gemma-2-2B-Smart" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Gemma-2-2B-Smart</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Gemma-2-2B-Smart-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Gemma-2-2B-Smart
426fc5f77a0f217150567a10e7fec5234cafa29b
10.674609
0
2.614
false
false
false
true
2.481797
0.132066
13.206625
0.397427
15.070459
0.033233
3.323263
0.282718
4.362416
0.424854
12.240104
0.242603
15.844784
false
false
2025-01-13
2025-01-13
1
bunnycore/Gemma-2-2B-Smart (Merge)
bunnycore_Gemma2-9B-TitanFusion_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Gemma2-9B-TitanFusion" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Gemma2-9B-TitanFusion</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Gemma2-9B-TitanFusion-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Gemma2-9B-TitanFusion
e2bb7d187d8dba7488fc134af45ca9b3139adfb4
19.471501
1
10.159
false
false
false
true
2.046713
0.161842
16.184169
0.571203
39.050564
0.077039
7.703927
0.332215
10.961969
0.413625
10.036458
0.396027
32.891918
false
false
2024-09-17
2025-02-13
1
bunnycore/Gemma2-9B-TitanFusion (Merge)
bunnycore_HyperLlama-3.1-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/HyperLlama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/HyperLlama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__HyperLlama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/HyperLlama-3.1-8B
659b18ffaee2c1e8dbe8a9a56a44502325d71696
28.448976
apache-2.0
5
8.03
true
false
false
true
1.789089
0.788301
78.83006
0.510339
29.806656
0.182779
18.277946
0.286913
4.9217
0.382927
7.932552
0.378324
30.924941
true
false
2024-09-04
2024-09-05
0
bunnycore/HyperLlama-3.1-8B
bunnycore_Llama-3.1-8B-TitanFusion-Mix_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.1-8B-TitanFusion-Mix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.1-8B-TitanFusion-Mix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.1-8B-TitanFusion-Mix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.1-8B-TitanFusion-Mix
9eb89de7df048276ccbc4405ce4f005f9185f82e
25.012248
2
8.03
false
false
false
false
1.86617
0.492495
49.249547
0.575596
39.535483
0.128399
12.839879
0.295302
6.040268
0.431698
12.46224
0.369515
29.94607
false
false
2024-09-23
2024-09-23
1
bunnycore/Llama-3.1-8B-TitanFusion-Mix (Merge)
bunnycore_Llama-3.1-8B-TitanFusion-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.1-8B-TitanFusion-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.1-8B-TitanFusion-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.1-8B-TitanFusion-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.1-8B-TitanFusion-v3
ea8269ac3b2e9c0dc855a9089251ebdb273ada16
24.219133
2
8.03
false
false
false
false
1.775648
0.480955
48.095498
0.526211
32.072941
0.141994
14.199396
0.308725
7.829978
0.430208
11.942708
0.380568
31.174276
false
false
2024-09-22
2024-09-22
1
bunnycore/Llama-3.1-8B-TitanFusion-v3 (Merge)
bunnycore_Llama-3.2-3B-All-Mix_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-All-Mix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-All-Mix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-All-Mix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-All-Mix
adacdd571c4073990ecf05a23277793e9e5f0410
22.945179
2
3.607
false
false
false
true
1.480619
0.722605
72.260491
0.450834
22.516311
0.150302
15.030211
0.262584
1.677852
0.328698
2.18724
0.315991
23.998966
false
false
2024-10-20
2024-10-20
1
bunnycore/Llama-3.2-3B-All-Mix (Merge)
bunnycore_Llama-3.2-3B-Bespoke-Thought_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-Bespoke-Thought" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-Bespoke-Thought</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Bespoke-Thought-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-Bespoke-Thought
e8d08b4548da570ba29ceb4e5ea4a0a75c14377a
18.009909
3
3.213
false
false
false
true
1.22351
0.411262
41.126212
0.452174
22.516566
0.164653
16.465257
0.26594
2.12528
0.33025
2.38125
0.311004
23.444888
false
false
2025-02-02
2025-02-02
1
bunnycore/Llama-3.2-3B-Bespoke-Thought (Merge)
bunnycore_Llama-3.2-3B-Booval_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-Booval" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-Booval</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Booval-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-Booval
d7f3449f89fa86d8e2c411aa4ca10ad552a62803
21.565451
2
3.213
false
false
false
true
1.310481
0.666926
66.692598
0.451439
22.515991
0.126888
12.688822
0.266779
2.237136
0.339427
2.395052
0.305768
22.863106
false
false
2024-10-27
2024-10-28
1
bunnycore/Llama-3.2-3B-Booval (Merge)
bunnycore_Llama-3.2-3B-Deep-Test_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-Deep-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-Deep-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Deep-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-Deep-Test
cdf5651d7e39bfc6e70d4908137d103013c09109
3.973673
0
1.803
false
false
false
true
0.596528
0.17753
17.753006
0.295026
2.572323
0
0
0.251678
0.223714
0.364667
2.75
0.104887
0.542996
false
false
2025-01-02
2025-01-01
1
bunnycore/Llama-3.2-3B-Deep-Test (Merge)
bunnycore_Llama-3.2-3B-Deep-Test_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-Deep-Test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-Deep-Test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Deep-Test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-Deep-Test
5fc2f2f533e2de433d4bf99de72745aa2e32f914
18.441057
0
3.607
false
false
false
true
1.739575
0.465168
46.516798
0.453085
22.914432
0.128399
12.839879
0.264262
1.901566
0.339396
2.557813
0.315243
23.915854
false
false
2025-01-02
2025-01-02
1
bunnycore/Llama-3.2-3B-Deep-Test (Merge)
bunnycore_Llama-3.2-3B-Della_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-Della" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-Della</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Della-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-Della
53bc0a13b30227548abc34d4a5d7242e7a3dee74
12.217686
0
3.607
false
false
false
true
1.254836
0.356083
35.608297
0.368349
11.467459
0.030211
3.021148
0.276007
3.467562
0.390156
7.202865
0.212849
12.538785
false
false
2025-01-19
0
Removed
bunnycore_Llama-3.2-3B-Long-Think_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-Long-Think" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-Long-Think</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Long-Think-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-Long-Think
a8522bfc03657b41b0541b164a98ddff302a6fd2
19.82599
1
3.213
false
false
false
true
2.071543
0.54735
54.734992
0.461039
24.226803
0.14577
14.577039
0.260906
1.454139
0.339552
1.210677
0.304771
22.75229
false
false
2024-10-24
2024-10-24
1
bunnycore/Llama-3.2-3B-Long-Think (Merge)
bunnycore_Llama-3.2-3B-Mix-Skill_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-Mix-Skill" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-Mix-Skill</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-Mix-Skill-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-Mix-Skill
d07d6e733aaeaf48cb6616228d00104b05b52afd
21.739566
2
3.607
false
false
false
true
1.370134
0.640423
64.042297
0.458184
23.784247
0.147281
14.728097
0.261745
1.565996
0.339615
2.751823
0.312084
23.564938
false
false
2024-10-24
2024-10-24
1
bunnycore/Llama-3.2-3B-Mix-Skill (Merge)
bunnycore_Llama-3.2-3B-ProdigyPlus_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-ProdigyPlus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-ProdigyPlus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-ProdigyPlus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-ProdigyPlus
799f7669701ecf27f4c3e29998dd839b4d54c408
16.356008
2
3.607
false
false
false
true
1.42788
0.40152
40.152019
0.439228
20.622989
0.115559
11.555891
0.268456
2.46085
0.358
3.15
0.281749
20.194297
false
false
2024-10-25
2024-10-25
1
bunnycore/Llama-3.2-3B-ProdigyPlus (Merge)
bunnycore_Llama-3.2-3B-ProdigyPlusPlus_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-ProdigyPlusPlus" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-ProdigyPlusPlus</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-ProdigyPlusPlus-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-ProdigyPlusPlus
512865708a7ec9754997fb404b1ffc0752b099d7
6.708176
0
3.607
false
false
false
true
1.343151
0.164516
16.451571
0.368993
11.561978
0.045317
4.531722
0.253356
0.447427
0.354125
1.698958
0.150017
5.557402
false
false
2024-10-28
2024-10-28
1
bunnycore/Llama-3.2-3B-ProdigyPlusPlus (Merge)
bunnycore_Llama-3.2-3B-RP-DeepThink_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-RP-DeepThink" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-RP-DeepThink</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-RP-DeepThink-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-RP-DeepThink
b728a56e39f2ac38926b380f5327a932ff04f2e7
23.211039
2
3.607
false
false
false
true
1.32722
0.714387
71.438672
0.456256
23.757462
0.160876
16.087613
0.26594
2.12528
0.330219
0.94401
0.324219
24.913194
false
false
2024-12-27
2024-12-27
1
bunnycore/Llama-3.2-3B-RP-DeepThink (Merge)
bunnycore_Llama-3.2-3B-RRStock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-RRStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-RRStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-RRStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-RRStock
79c38a10ff5b4be3618e8cb1dad6b83a67570499
22.674689
0
3.607
false
false
false
true
1.144664
0.665727
66.572694
0.456769
23.921831
0.16994
16.993958
0.26594
2.12528
0.331427
1.595052
0.323554
24.839317
false
false
2025-02-03
2025-02-03
1
bunnycore/Llama-3.2-3B-RRStock (Merge)
bunnycore_Llama-3.2-3B-ToxicKod_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3B-ToxicKod" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3B-ToxicKod</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3B-ToxicKod-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3B-ToxicKod
5491f02af0048f1549b3eeca74bd5c5e5a675363
21.269124
2
3.213
false
false
false
true
1.210685
0.63193
63.192995
0.452543
22.983328
0.16994
16.993958
0.26594
2.12528
0.347458
1.432292
0.287982
20.886894
false
false
2025-03-10
2025-03-11
1
bunnycore/Llama-3.2-3B-ToxicKod (Merge)
bunnycore_Llama-3.2-3b-RP-Toxic-Fuse_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Llama-3.2-3b-RP-Toxic-Fuse" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Llama-3.2-3b-RP-Toxic-Fuse</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Llama-3.2-3b-RP-Toxic-Fuse-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Llama-3.2-3b-RP-Toxic-Fuse
ca68bc3095297b349bd87e6eed2e419fcf32fbe8
25.277382
2
3.213
false
false
false
true
0.578951
0.683362
68.336237
0.464972
24.366031
0.240181
24.018127
0.277685
3.691275
0.395365
7.853906
0.310588
23.398715
false
false
2025-03-11
2025-03-11
1
bunnycore/Llama-3.2-3b-RP-Toxic-Fuse (Merge)
bunnycore_Maestro-S1k-7B-Sce_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Maestro-S1k-7B-Sce" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Maestro-S1k-7B-Sce</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Maestro-S1k-7B-Sce-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Maestro-S1k-7B-Sce
a0acad45ee6dd3cac16f569a5f86497258643bc0
6.835618
1
7.613
false
false
false
true
0.723416
0.252268
25.226843
0.310438
4.844047
0.027946
2.794562
0.260906
1.454139
0.376823
4.802865
0.117021
1.891253
false
false
2025-02-20
2025-02-21
1
bunnycore/Maestro-S1k-7B-Sce (Merge)
bunnycore_Phi-3.5-mini-TitanFusion-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-3.5-mini-TitanFusion-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-3.5-mini-TitanFusion-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-3.5-mini-TitanFusion-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-3.5-mini-TitanFusion-0.1
72939b8b75e23b22b1758bb05a842e5834f75d96
26.235792
0
3.821
false
false
false
true
1.593771
0.522795
52.279507
0.537373
35.446219
0.11858
11.858006
0.331376
10.850112
0.445313
15.797396
0.380652
31.183511
false
false
2024-10-13
2024-10-13
1
bunnycore/Phi-3.5-mini-TitanFusion-0.1 (Merge)
bunnycore_Phi-4-Model-Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Model-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Model-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Model-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Model-Stock
4b7a2eafbf33e8cf7552b7ed62305c292c157895
40.785716
mit
6
14.66
true
false
false
true
1.91982
0.687884
68.78837
0.68897
55.315678
0.429758
42.975831
0.354866
13.982103
0.444135
15.116927
0.536818
48.535387
true
false
2025-01-11
2025-01-11
1
bunnycore/Phi-4-Model-Stock (Merge)
bunnycore_Phi-4-Model-Stock-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Model-Stock-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Model-Stock-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Model-Stock-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Model-Stock-v2
e69d6350c5c930dff7afd24bfaa584e0dfff0334
39.14462
2
14.66
false
false
false
true
1.884969
0.637525
63.75251
0.682467
54.686374
0.375378
37.537764
0.348993
13.199105
0.466177
17.572135
0.533078
48.119829
false
false
2025-01-17
2025-01-17
1
bunnycore/Phi-4-Model-Stock-v2 (Merge)
bunnycore_Phi-4-Model-Stock-v3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Model-Stock-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Model-Stock-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Model-Stock-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Model-Stock-v3
abc3cbe3f4f850ea9f317595c3ee3ff8e22fe220
37.672991
3
14.66
false
false
false
true
1.861208
0.591164
59.116367
0.67263
52.783611
0.490181
49.018127
0.28943
5.257271
0.416635
11.179427
0.538148
48.683141
false
false
2025-01-18
2025-01-18
1
bunnycore/Phi-4-Model-Stock-v3 (Merge)
bunnycore_Phi-4-Model-Stock-v4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Model-Stock-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Model-Stock-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Model-Stock-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Model-Stock-v4
76e0de4ca96533eaa7bfec95b9cc4caeb4e1db6b
41.216842
9
14.66
false
false
false
true
1.886843
0.711015
71.101455
0.69243
55.901736
0.382931
38.293051
0.369128
15.883669
0.461063
17.299479
0.539395
48.821661
false
false
2025-01-25
2025-01-25
1
bunnycore/Phi-4-Model-Stock-v4 (Merge)