| Column | dtype | Range / distinct values |
|---|---|---|
| eval_name | string | lengths 12–111 |
| Precision | string | 3 classes |
| Type | string | 7 classes |
| T | string | 7 classes |
| Weight type | string | 2 classes |
| Architecture | string | 64 classes |
| Model | string | lengths 355–689 |
| fullname | string | lengths 4–102 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | string | 27 classes |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 classes |
| Submission Date | string | 263 classes |
| Generation | int64 | 0–10 |
| Base Model | string | lengths 4–102 |
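Read as a table, the relationship between the per-benchmark scores and the aggregate column is easy to check. The snippet below is a minimal sketch in plain Python (no leaderboard tooling) verifying, for the first record (cyberagent/calm3-22b-chat), that `Average ⬆️` equals the unweighted mean of the six scaled benchmark scores; that this holds is an observation from this sample, not documented behavior of the dataset.

```python
# Scaled benchmark scores for cyberagent/calm3-22b-chat, copied from the records below.
scores = {
    "IFEval": 50.913133,
    "BBH": 29.520884,
    "MATH Lvl 5": 6.94864,
    "GPQA": 3.579418,
    "MUSR": 16.082031,
    "MMLU-PRO": 21.662603,
}

# "Average ⬆️" appears to be the plain (unweighted) mean of the six scaled scores.
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 21.451118, matching the record's Average ⬆️
```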
Each record is shown below as one row in two tables keyed by `fullname`. Three derivable columns are folded into this legend rather than repeated per row: `eval_name` is always `{fullname with "/" replaced by "_"}_{Precision}`; `T` is the leading emoji of `Type`; and `Model` renders `fullname` as a link to `https://huggingface.co/{fullname}` plus a 📑 link to the details dataset at `https://huggingface.co/datasets/open-llm-leaderboard/{org}__{name}-details`. Type legend: 🟢 pretrained · 🟩 continuously pretrained · 🔶 fine-tuned on domain-specific datasets · 💬 chat models (RLHF, DPO, IFT, ...) · 🤝 base merges and moerges · ❓ other. Empty cells are missing values in the source.

Model metadata:

| fullname | Precision | Type | Weight type | Architecture | Model sha | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| cyberagent/calm3-22b-chat | bfloat16 | 💬 | Original | LlamaForCausalLM | 055922aa0f0fb1fbfbc97a2e31134532485ee99b | apache-2.0 | 75 | 22.543 | true | false | false | true | false | false | 2024-07-01 | 2024-07-04 | 0 | cyberagent/calm3-22b-chat |
| darkc0de/BuddyGlassNeverSleeps | float16 | 🔶 | Original | LlamaForCausalLM | f8849498f02c94b68ef0308a7bf6637264949a7d | | 2 | 8.03 | false | false | false | false | false | false | 2024-09-16 | 2024-09-16 | 1 | darkc0de/BuddyGlassNeverSleeps (Merge) |
| darkc0de/BuddyGlassUncensored2025.2 | float16 | 🔶 | Original | LlamaForCausalLM | e5d8aedaee374cc87d985cd76818f61f529b4476 | apache-2.0 | 3 | 10.306 | true | false | false | true | false | false | 2025-01-13 | 2025-01-15 | 3 | tiiuae/Falcon3-10B-Base |
| darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp | bfloat16 | 🤝 | Original | LlamaForCausalLM | 57367fefe01c7d9653c303b28449b416fc777d93 | | 3 | 0.007 | false | false | false | false | false | false | 2024-09-10 | 2024-09-15 | 1 | darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp (Merge) |
| databricks/dbrx-base | float16 | ❓ | Original | Unknown | d7d18d833146403dd74c2620b8434639ae123d6e | other | 556 | 0 | true | true | false | false | false | true | 2024-03-26 | | 0 | databricks/dbrx-base |
| databricks/dbrx-instruct | bfloat16 | 💬 | Original | DbrxForCausalLM | c0a9245908c187da8f43a81e538e67ff360904ea | other | 1,111 | 131.597 | true | false | false | true | false | true | 2024-03-26 | 2024-06-12 | 0 | databricks/dbrx-instruct |
| databricks/dolly-v1-6b | bfloat16 | 🟢 | Original | GPTJForCausalLM | c9a85b3a322b402e20c839c702c725afe0cb454d | cc-by-nc-4.0 | 310 | 6 | true | false | false | false | false | true | 2023-03-23 | 2024-06-12 | 0 | databricks/dolly-v1-6b |
| databricks/dolly-v2-12b | bfloat16 | 🔶 | Original | GPTNeoXForCausalLM | 19308160448536e378e3db21a73a751579ee7fdd | mit | 1,955 | 12 | true | false | false | false | false | true | 2023-04-11 | 2024-06-12 | 0 | databricks/dolly-v2-12b |
| databricks/dolly-v2-3b | bfloat16 | 🔶 | Original | GPTNeoXForCausalLM | f6c9be08f16fe4d3a719bee0a4a7c7415b5c65df | mit | 287 | 3 | true | false | false | false | false | true | 2023-04-13 | 2024-06-12 | 0 | databricks/dolly-v2-3b |
| databricks/dolly-v2-7b | bfloat16 | 🔶 | Original | GPTNeoXForCausalLM | d632f0c8b75b1ae5b26b250d25bfba4e99cb7c6f | mit | 149 | 7 | true | false | false | false | false | true | 2023-04-13 | 2024-06-12 | 0 | databricks/dolly-v2-7b |
| davidkim205/Rhea-72b-v0.5 | bfloat16 | 💬 | Original | LlamaForCausalLM | bc3806efb23d2713e6630a748d9747fd76b27169 | apache-2.0 | 135 | 72 | true | false | false | false | false | false | 2024-03-22 | 2024-09-15 | 0 | davidkim205/Rhea-72b-v0.5 |
| davidkim205/nox-solar-10.7b-v4 | bfloat16 | 🔶 | Original | LlamaForCausalLM | 5f4be6cb7d8398b84689148d15f3838f2e01e104 | apache-2.0 | 10 | 10.732 | true | false | false | true | false | false | 2024-03-16 | 2024-10-04 | 0 | davidkim205/nox-solar-10.7b-v4 |
| deepseek-ai/DeepSeek-R1-Distill-Llama-70B | bfloat16 | 💬 | Original | LlamaForCausalLM | 07a264a567ba0863a4ab34fdb3c2b8a54e0bb494 | mit | 636 | 70.554 | true | false | false | true | false | true | 2025-01-20 | 2025-01-22 | 0 | deepseek-ai/DeepSeek-R1-Distill-Llama-70B |
| deepseek-ai/DeepSeek-R1-Distill-Llama-8B | bfloat16 | 🔶 | Original | LlamaForCausalLM | 2f96d315ae1d52352452b3c13d12cdd781d762f0 | mit | 659 | 8.03 | true | false | false | true | false | true | 2025-01-20 | 2025-01-20 | 0 | deepseek-ai/DeepSeek-R1-Distill-Llama-8B |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B | bfloat16 | 💬 | Original | Qwen2ForCausalLM | 80da49efd7aed5a338dfa7f23f75a9311f0dec20 | mit | 1,060 | 1.777 | true | false | false | true | false | true | 2025-01-20 | 2025-01-20 | 0 | deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-14B | bfloat16 | 🔶 | Original | Qwen2ForCausalLM | c79f47acaf303faabb7133b4b7b76f24231f2c8d | mit | 470 | 14.77 | true | false | false | false | false | true | 2025-01-20 | 2025-01-20 | 0 | deepseek-ai/DeepSeek-R1-Distill-Qwen-14B |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | bfloat16 | 🔶 | Original | Qwen2ForCausalLM | 4569fd730224ec487752bd4954399c6e18bf3aa6 | mit | 1,279 | 32.764 | true | false | false | true | false | true | 2025-01-20 | 2025-01-21 | 0 | deepseek-ai/DeepSeek-R1-Distill-Qwen-32B |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-7B | bfloat16 | 💬 | Original | Qwen2ForCausalLM | 008b8c2e0b59dac9b7619d58a5ad609f43a5b6b1 | mit | 559 | 7.616 | true | false | false | true | false | true | 2025-01-20 | 2025-01-20 | 0 | deepseek-ai/DeepSeek-R1-Distill-Qwen-7B |
| deepseek-ai/deepseek-llm-67b-chat | bfloat16 | 💬 | Original | LlamaForCausalLM | 79648bef7658bb824e4630740f6e1484c1b0620b | other | 198 | 67 | true | false | false | true | false | true | 2023-11-29 | 2024-06-12 | 0 | deepseek-ai/deepseek-llm-67b-chat |
| deepseek-ai/deepseek-llm-7b-base | bfloat16 | 🟢 | Original | LlamaForCausalLM | 7683fea62db869066ddaff6a41d032262c490d4f | other | 89 | 7 | true | false | false | false | false | true | 2023-11-29 | 2024-06-12 | 0 | deepseek-ai/deepseek-llm-7b-base |
| deepseek-ai/deepseek-llm-7b-chat | bfloat16 | 💬 | Original | LlamaForCausalLM | afbda8b347ec881666061fa67447046fc5164ec8 | other | 148 | 7 | true | false | false | true | false | true | 2023-11-29 | 2024-06-12 | 0 | deepseek-ai/deepseek-llm-7b-chat |
| deepseek-ai/deepseek-moe-16b-base | bfloat16 | 🟢 | Original | DeepseekForCausalLM | 521d2bc4fb69a3f3ae565310fcc3b65f97af2580 | other | 108 | 16.376 | true | true | false | false | false | true | 2024-01-08 | 2024-06-12 | 0 | deepseek-ai/deepseek-moe-16b-base |
| deepseek-ai/deepseek-moe-16b-chat | bfloat16 | 💬 | Original | DeepseekForCausalLM | eefd8ac7e8dc90e095129fe1a537d5e236b2e57c | other | 131 | 16.376 | true | true | false | true | false | true | 2024-01-09 | 2024-06-12 | 0 | deepseek-ai/deepseek-moe-16b-chat |
| dfurman/CalmeRys-78B-Orpo-v0.1 | bfloat16 | 💬 | Original | Qwen2ForCausalLM | 7988deb48419c3f56bb24c139c23e5c476ec03f8 | mit | 72 | 77.965 | true | false | false | true | false | false | 2024-09-24 | 2024-09-24 | 1 | dfurman/CalmeRys-78B-Orpo-v0.1 (Merge) |
| dfurman/Llama-3-70B-Orpo-v0.1 | float16 | 💬 | Original | LlamaForCausalLM | 6bf3be5f7f427164c879f7a4ec9ccb6b22aa6631 | llama3 | 2 | 70.554 | true | false | false | true | false | false | 2024-04-26 | 2024-08-30 | 1 | dfurman/Llama-3-70B-Orpo-v0.1 (Merge) |
| dfurman/Llama-3-8B-Orpo-v0.1 | float16 | 💬 | Adapter | ? | f02aef830e12a50892ac065826d5eb3dfc7675d1 | llama3 | 1 | 8.03 | true | false | false | true | false | false | 2024-04-26 | 2024-08-30 | 1 | dfurman/Llama-3-8B-Orpo-v0.1 (Merge) |
| dfurman/Llama-3-8B-Orpo-v0.1 | bfloat16 | 💬 | Original | LlamaForCausalLM | f02aef830e12a50892ac065826d5eb3dfc7675d1 | llama3 | 1 | 8.03 | true | false | false | true | false | false | 2024-04-26 | 2024-08-30 | 1 | dfurman/Llama-3-8B-Orpo-v0.1 (Merge) |
| dfurman/Qwen2-72B-Orpo-v0.1 | bfloat16 | 💬 | Original | Qwen2ForCausalLM | 26c7bbaa728822c60bb47b2808972140653aae4c | other | 4 | 72.699 | true | false | false | true | false | false | 2024-07-05 | 2024-08-22 | 1 | dfurman/Qwen2-72B-Orpo-v0.1 (Merge) |
| dicta-il/dictalm2.0 | bfloat16 | 🟩 | Original | MistralForCausalLM | f8ab3208e95a7b44a9a2fbb9bbbdd8ea11be509d | apache-2.0 | 13 | 7.251 | true | false | false | false | false | false | 2024-04-10 | 2024-07-31 | 0 | dicta-il/dictalm2.0 |
| dicta-il/dictalm2.0-instruct | bfloat16 | 💬 | Original | MistralForCausalLM | 257c6023d6ac1bfa12110b7b17e7600da7da4e1e | apache-2.0 | 21 | 7.251 | true | false | false | true | false | false | 2024-04-14 | 2024-07-31 | 1 | dicta-il/dictalm2.0 |
| distilbert/distilgpt2 | bfloat16 | 🟢 | Original | GPT2LMHeadModel | 2290a62682d06624634c1f46a6ad5be0f47f38aa | apache-2.0 | 501 | 0.088 | true | false | false | false | false | true | 2022-03-02 | 2024-06-12 | 0 | distilbert/distilgpt2 |
| divyanshukunwar/SASTRI_1_9B | float16 | 🔶 | Original | Gemma2ForCausalLM | 3afeb5b296b1d6489401105e2ea6fc5c00d09c07 | apache-2.0 | 0 | 5.211 | true | false | false | true | false | false | 2024-11-20 | 2024-11-23 | 1 | divyanshukunwar/SASTRI_1_9B (Merge) |
| djuna/G2-BigGSHT-27B-2 | bfloat16 | 🤝 | Original | Gemma2ForCausalLM | b52e0c08d19232acebf85b68ee5989cc23c0d519 | | 0 | 27.227 | false | false | false | true | false | false | 2024-10-29 | 2024-11-06 | 1 | djuna/G2-BigGSHT-27B-2 (Merge) |
| djuna/G2-GSHT | bfloat16 | 🤝 | Original | Gemma2ForCausalLM | afa34f893a74af2a21b71f83d7bcc16aa818d157 | | 0 | 10.159 | false | false | false | true | false | false | 2024-09-09 | 2024-10-05 | 1 | djuna/G2-GSHT (Merge) |
| djuna/Gemma-2-gemmama-9b | bfloat16 | 🤝 | Original | Gemma2ForCausalLM | 1d6c53ad18970ac082e86bfa0159789b6a6e79c0 | | 3 | 10.159 | false | false | false | true | false | false | 2024-08-31 | 2024-10-05 | 1 | djuna/Gemma-2-gemmama-9b (Merge) |
| djuna/L3.1-ForStHS | bfloat16 | 🤝 | Original | LlamaForCausalLM | f5442e1f27e4a0c469504624ea85afdc6907c9cc | | 3 | 8.03 | false | false | false | true | false | false | 2024-09-10 | 2024-09-15 | 1 | djuna/L3.1-ForStHS (Merge) |
| djuna/L3.1-Promissum_Mane-8B-Della-1.5-calc | bfloat16 | 🤝 | Original | LlamaForCausalLM | 67dc71cb877c1ebaeb634e116fc938b223338cf6 | | 2 | 8.03 | false | false | false | true | false | false | 2024-10-29 | 2024-10-29 | 1 | djuna/L3.1-Promissum_Mane-8B-Della-1.5-calc (Merge) |
| djuna/L3.1-Promissum_Mane-8B-Della-calc | bfloat16 | 🤝 | Original | LlamaForCausalLM | 42c6cd88b8394876cdbcf64e56633ad0a371b5f4 | | 1 | 8.03 | false | false | false | true | false | false | 2024-10-07 | 2024-10-20 | 1 | djuna/L3.1-Promissum_Mane-8B-Della-calc (Merge) |
| djuna/L3.1-Purosani-2-8B | bfloat16 | 🤝 | Original | LlamaForCausalLM | e5acd6277a1286c5e18fcb3e89a836ffc8a75b8f | | 3 | 8.03 | false | false | false | true | false | false | 2024-10-04 | 2024-10-20 | 1 | djuna/L3.1-Purosani-2-8B (Merge) |

Benchmark scores (raw scores on a 0–1 scale, scaled scores in points):

| fullname | Average ⬆️ | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| cyberagent/calm3-22b-chat | 21.451118 | 3.548496 | 0.509131 | 50.913133 | 0.499168 | 29.520884 | 0.069486 | 6.94864 | 0.276846 | 3.579418 | 0.455323 | 16.082031 | 0.294963 | 21.662603 |
| darkc0de/BuddyGlassNeverSleeps | 19.820642 | 2.708298 | 0.423902 | 42.390191 | 0.497723 | 28.477953 | 0.062689 | 6.268882 | 0.294463 | 5.928412 | 0.399271 | 8.608854 | 0.345246 | 27.249557 |
| darkc0de/BuddyGlassUncensored2025.2 | 33.625793 | 1.790304 | 0.773113 | 77.311312 | 0.609541 | 43.571245 | 0.240181 | 24.018127 | 0.32802 | 10.402685 | 0.407083 | 9.385417 | 0.433594 | 37.065972 |
| darkc0de/BuddyGlass_v0.3_Xortron7MethedUpSwitchedUp | 22.328255 | 1.796364 | 0.435842 | 43.584245 | 0.524309 | 31.869311 | 0.128399 | 12.839879 | 0.298658 | 6.487696 | 0.414333 | 9.491667 | 0.367271 | 29.696735 |
| databricks/dbrx-base | 16.359432 | 10.45341 | 0.082147 | 8.214724 | 0.519583 | 32.608538 | 0.1 | 10 | 0.326667 | 10.222222 | 0.406667 | 9.333333 | 0.35 | 27.777778 |
| databricks/dbrx-instruct | 25.19901 | 47.958027 | 0.54158 | 54.157968 | 0.542896 | 35.96382 | 0.068731 | 6.873112 | 0.341443 | 12.192394 | 0.426927 | 12.199219 | 0.368268 | 29.80755 |
| databricks/dolly-v1-6b | 6.981232 | 1.32156 | 0.222443 | 22.244312 | 0.317209 | 4.781309 | 0.018882 | 1.888218 | 0.264262 | 1.901566 | 0.400417 | 8.11875 | 0.126579 | 2.953236 |
| databricks/dolly-v2-12b | 6.370436 | 2.794239 | 0.235507 | 23.550734 | 0.331997 | 6.377894 | 0.013595 | 1.359517 | 0.240772 | 0 | 0.373906 | 5.504948 | 0.112866 | 1.429521 |
| databricks/dolly-v2-3b | 5.599658 | 1.516169 | 0.224716 | 22.471598 | 0.307928 | 3.324769 | 0.015106 | 1.510574 | 0.260906 | 1.454139 | 0.333781 | 3.222656 | 0.114528 | 1.614214 |
| databricks/dolly-v2-7b | 5.64736 | 1.660412 | 0.200986 | 20.098561 | 0.317306 | 5.449893 | 0.01435 | 1.435045 | 0.268456 | 2.46085 | 0.355302 | 2.779427 | 0.114943 | 1.660387 |
| davidkim205/Rhea-72b-v0.5 | 5.998956 | 17.377382 | 0.014538 | 1.453809 | 0.307834 | 3.670747 | 0.173716 | 17.371601 | 0.252517 | 0.33557 | 0.424135 | 11.316927 | 0.116606 | 1.84508 |
| davidkim205/nox-solar-10.7b-v4 | 18.514321 | 1.697953 | 0.375342 | 37.534187 | 0.481404 | 26.631088 | 0.008308 | 0.830816 | 0.307047 | 7.606264 | 0.429844 | 12.563802 | 0.333278 | 25.91977 |
| deepseek-ai/DeepSeek-R1-Distill-Llama-70B | 27.809426 | 118.546466 | 0.433594 | 43.359398 | 0.563496 | 35.819862 | 0.307402 | 30.740181 | 0.265101 | 2.013423 | 0.434219 | 13.277344 | 0.474817 | 41.64635 |
| deepseek-ai/DeepSeek-R1-Distill-Llama-8B | 13.05995 | 1.479806 | 0.37824 | 37.823974 | 0.323935 | 5.325247 | 0.219789 | 21.978852 | 0.255034 | 0.671141 | 0.324979 | 0.455729 | 0.208943 | 12.104758 |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B | 10.351037 | 1.236411 | 0.346341 | 34.634104 | 0.324099 | 4.729119 | 0.169184 | 16.918429 | 0.255872 | 0.782998 | 0.363458 | 2.965625 | 0.118684 | 2.075946 |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-14B | 38.221465 | 3.99219 | 0.438165 | 43.816518 | 0.590557 | 40.690767 | 0.570242 | 57.024169 | 0.387584 | 18.344519 | 0.536625 | 28.711458 | 0.466672 | 40.741356 |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-32B | 22.962268 | 47.275905 | 0.418631 | 41.863145 | 0.419692 | 17.149674 | 0.170695 | 17.069486 | 0.284396 | 4.58613 | 0.452604 | 16.142188 | 0.468667 | 40.962988 |
| deepseek-ai/DeepSeek-R1-Distill-Qwen-7B | 14.994923 | 1.369932 | 0.403769 | 40.376867 | 0.344257 | 7.882703 | 0.195619 | 19.561934 | 0.279362 | 3.914989 | 0.366281 | 3.551823 | 0.232131 | 14.68122 |
| deepseek-ai/deepseek-llm-67b-chat | 27.310632 | 119.643617 | 0.558715 | 55.871532 | 0.524342 | 33.225242 | 0.0929 | 9.29003 | 0.316275 | 8.836689 | 0.505865 | 23.933073 | 0.394365 | 32.707225 |
| deepseek-ai/deepseek-llm-7b-base | 8.227098 | 1.645071 | 0.217872 | 21.787191 | 0.350303 | 9.767925 | 0.019637 | 1.963746 | 0.27349 | 3.131991 | 0.373781 | 3.75599 | 0.180602 | 8.955748 |
| deepseek-ai/deepseek-llm-7b-chat | 14.823157 | 1.548965 | 0.417082 | 41.708223 | 0.363208 | 11.258949 | 0.020393 | 2.039275 | 0.26594 | 2.12528 | 0.466771 | 19.213021 | 0.213348 | 12.594193 |
| deepseek-ai/deepseek-moe-16b-base | 7.466334 | 14.004931 | 0.244974 | 24.497445 | 0.340946 | 8.355556 | 0.024169 | 2.416918 | 0.254195 | 0.559284 | 0.365781 | 3.35599 | 0.150515 | 5.61281 |
| deepseek-ai/deepseek-moe-16b-chat | 10.290615 | 9.186956 | 0.366299 | 36.62992 | 0.327495 | 6.573749 | 0.02568 | 2.567976 | 0.224832 | 0 | 0.38076 | 5.261719 | 0.196393 | 10.710328 |
| dfurman/CalmeRys-78B-Orpo-v0.1 | 51.231323 | 25.993535 | 0.816327 | 81.632734 | 0.726228 | 61.924764 | 0.406344 | 40.634441 | 0.400168 | 20.022371 | 0.590177 | 36.372135 | 0.701213 | 66.801492 |
| dfurman/Llama-3-70B-Orpo-v0.1 | 18.300061 | 28.880685 | 0.204907 | 20.490742 | 0.465524 | 24.093817 | 0.157855 | 15.785498 | 0.25755 | 1.006711 | 0.453438 | 16.279688 | 0.389295 | 32.143913 |
| dfurman/Llama-3-8B-Orpo-v0.1 (float16) | 10.89448 | 1.856159 | 0.283518 | 28.351773 | 0.384242 | 13.680746 | 0.052115 | 5.21148 | 0.260906 | 1.454139 | 0.356635 | 2.246094 | 0.229804 | 14.422651 |
| dfurman/Llama-3-8B-Orpo-v0.1 (bfloat16) | 11.076158 | 0.949861 | 0.300004 | 30.000399 | 0.385297 | 13.773376 | 0.041541 | 4.154079 | 0.261745 | 1.565996 | 0.357875 | 2.734375 | 0.228059 | 14.228723 |
| dfurman/Qwen2-72B-Orpo-v0.1 | 44.1723 | 25.250663 | 0.787976 | 78.79759 | 0.696902 | 57.414364 | 0.405589 | 40.558912 | 0.384228 | 17.897092 | 0.478427 | 20.870052 | 0.545462 | 49.495789 |
| dicta-il/dictalm2.0 | 11.895185 | 1.348077 | 0.241327 | 24.132746 | 0.401787 | 16.489846 | 0.018127 | 1.812689 | 0.291946 | 5.592841 | 0.381969 | 5.51276 | 0.260472 | 17.83023 |
| dicta-il/dictalm2.0-instruct | 16.779221 | 1.29679 | 0.441213 | 44.121265 | 0.425608 | 19.688076 | 0.022659 | 2.265861 | 0.302852 | 7.04698 | 0.394583 | 9.722917 | 0.260472 | 17.83023 |
| distilbert/distilgpt2 | 4.002274 | 0.246163 | 0.0611 | 6.11001 | 0.303799 | 2.83522 | 0.006042 | 0.60423 | 0.259228 | 1.230425 | 0.420729 | 11.157813 | 0.118684 | 2.075946 |
| divyanshukunwar/SASTRI_1_9B | 19.42176 | 7.792431 | 0.420729 | 42.072922 | 0.46805 | 23.534216 | 0.115559 | 11.555891 | 0.321309 | 9.50783 | 0.383115 | 5.55599 | 0.318733 | 24.303709 |
| djuna/G2-BigGSHT-27B-2 | 36.047053 | 10.050858 | 0.797443 | 79.744301 | 0.641474 | 48.814372 | 0.234894 | 23.489426 | 0.363255 | 15.100671 | 0.407208 | 9.934375 | 0.452793 | 39.199173 |
| djuna/G2-GSHT | 24.632234 | 4.303384 | 0.563012 | 56.30117 | 0.526973 | 30.992059 | 0.192598 | 19.259819 | 0.325503 | 10.067114 | 0.400573 | 8.171615 | 0.307015 | 23.001625 |
| djuna/Gemma-2-gemmama-9b | 28.752477 | 5.528194 | 0.77034 | 77.034047 | 0.542004 | 32.916051 | 0.192598 | 19.259819 | 0.33557 | 11.409396 | 0.403146 | 8.459896 | 0.310921 | 23.435653 |
| djuna/L3.1-ForStHS | 28.348156 | 1.687329 | 0.781331 | 78.133131 | 0.52027 | 31.391217 | 0.150302 | 15.030211 | 0.291107 | 5.480984 | 0.402646 | 9.664063 | 0.373504 | 30.389332 |
| djuna/L3.1-Promissum_Mane-8B-Della-1.5-calc | 29.587663 | 1.488306 | 0.723529 | 72.352912 | 0.543292 | 34.879576 | 0.163897 | 16.389728 | 0.314597 | 8.612975 | 0.425281 | 13.026823 | 0.390376 | 32.263963 |
| djuna/L3.1-Promissum_Mane-8B-Della-calc | 26.4888 | 1.645439 | 0.544153 | 54.415285 | 0.548588 | 35.553826 | 0.18429 | 18.429003 | 0.299497 | 6.599553 | 0.42299 | 12.807031 | 0.380153 | 31.128103 |
| djuna/L3.1-Purosani-2-8B | 23.113374 | 1.729904 | 0.498815 | 49.881537 | 0.518212 | 31.391343 | 0.117069 | 11.706949 | 0.301174 | 6.823266 | 0.381625 | 8.303125 | 0.375166 | 30.574025 |
djuna_L3.1-Suze-Vume-calc_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/L3.1-Suze-Vume-calc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/L3.1-Suze-Vume-calc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__L3.1-Suze-Vume-calc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/L3.1-Suze-Vume-calc
830c07d136ecd8171805078606f00c4ee69f21c3
26.000784
1
8.03
false
false
false
true
1.609038
0.729674
72.967393
0.516421
31.136638
0.114048
11.404834
0.281879
4.250559
0.384292
8.303125
0.351479
27.942154
false
false
2024-08-26
2024-09-04
1
djuna/L3.1-Suze-Vume-calc (Merge)
djuna_MN-Chinofun_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun
71b47c86f32e107b407fada44ec6b893c5eb8bb0
24.683834
3
12.248
false
false
false
true
2.892986
0.611022
61.102209
0.49527
28.483575
0.130665
13.066465
0.296141
6.152125
0.408354
10.377604
0.360289
28.921025
false
false
2024-09-16
2024-09-23
1
djuna/MN-Chinofun (Merge)
djuna_MN-Chinofun-12B-2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun-12B-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun-12B-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-12B-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun-12B-2
d2aab6837c2ad2dfebb18b15549affd9dd2b8723
25.682588
3
12.248
false
false
false
true
1.955393
0.617067
61.706716
0.503696
29.526084
0.130665
13.066465
0.305369
7.38255
0.426833
13.354167
0.361536
29.059545
false
false
2024-10-23
2024-11-26
1
djuna/MN-Chinofun-12B-2 (Merge)
djuna_MN-Chinofun-12B-3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun-12B-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun-12B-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-12B-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun-12B-3
fa64c9bc66221946d7425c4eea93828900083d84
18.389453
2
12.248
false
false
false
true
2.415479
0.305274
30.527445
0.534786
34.219196
0.100453
10.045317
0.26594
2.12528
0.419792
10.907292
0.30261
22.51219
false
false
2024-12-05
2024-12-05
1
djuna/MN-Chinofun-12B-3 (Merge)
djuna_MN-Chinofun-12B-4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/MN-Chinofun-12B-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/MN-Chinofun-12B-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__MN-Chinofun-12B-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/MN-Chinofun-12B-4
609b6b7bf20de7a6f93559f0d2572ae7b275ed78
24.402912
3
12.248
false
false
false
true
1.691792
0.540431
54.04305
0.534769
34.173042
0.111782
11.178248
0.295302
6.040268
0.430677
13.234635
0.349734
27.748227
false
false
2025-01-26
2025-01-26
1
djuna/MN-Chinofun-12B-4 (Merge)
djuna_Q2.5-Partron-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/Q2.5-Partron-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/Q2.5-Partron-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__Q2.5-Partron-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/Q2.5-Partron-7B
3a6d3cca23c0e1c6bcba38887fc819729d5d16cf
35.108466
0
7.613
false
false
false
true
2.002153
0.732122
73.212188
0.541847
35.257265
0.482628
48.26284
0.297819
6.375839
0.416542
11.067708
0.428275
36.474956
false
false
2024-11-08
2024-11-08
1
djuna/Q2.5-Partron-7B (Merge)
djuna_Q2.5-Veltha-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/Q2.5-Veltha-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/Q2.5-Veltha-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__Q2.5-Veltha-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/Q2.5-Veltha-14B
fd0c348618e5c8198b769d2f5ff1e3a810e007e7
42.519512
10
14.766
false
false
false
true
3.21349
0.829167
82.916661
0.648421
49.752432
0.478852
47.885196
0.35906
14.541387
0.419427
12.261719
0.529837
47.759678
false
false
2024-12-22
2024-12-22
1
djuna/Q2.5-Veltha-14B (Merge)
djuna_Q2.5-Veltha-14B-0.5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/djuna/Q2.5-Veltha-14B-0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna/Q2.5-Veltha-14B-0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna__Q2.5-Veltha-14B-0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna/Q2.5-Veltha-14B-0.5
db20da90002d4b1285f61e2648c4fdbec44e02e7
41.612279
10
14.766
false
false
false
true
2.942923
0.779583
77.958262
0.652303
50.318126
0.437311
43.731118
0.368289
15.771812
0.433906
14.171615
0.529505
47.722739
false
false
2024-12-22
2024-12-22
1
djuna/Q2.5-Veltha-14B-0.5 (Merge)
djuna-test-lab_TEST-L3.2-ReWish-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna-test-lab/TEST-L3.2-ReWish-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna-test-lab/TEST-L3.2-ReWish-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna-test-lab__TEST-L3.2-ReWish-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna-test-lab/TEST-L3.2-ReWish-3B
0cb7d434c4647faed475f17d74e9047007cd3782
22.571394
1
3.213
false
false
false
true
1.281262
0.636776
63.677598
0.449541
22.0667
0.136707
13.670695
0.283557
4.474273
0.37775
7.91875
0.312583
23.620346
false
false
2024-10-23
2024-10-24
1
djuna-test-lab/TEST-L3.2-ReWish-3B (Merge)
djuna-test-lab_TEST-L3.2-ReWish-3B-ties-w-base_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/djuna-test-lab__TEST-L3.2-ReWish-3B-ties-w-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base
ebab6c0266ae7846b2bb9a595a2651a23b031372
22.545998
0
3.213
false
false
false
true
1.922061
0.635252
63.525224
0.449541
22.0667
0.136707
13.670695
0.283557
4.474273
0.37775
7.91875
0.312583
23.620346
false
false
2024-10-23
2024-10-23
1
djuna-test-lab/TEST-L3.2-ReWish-3B-ties-w-base (Merge)
dnhkng_RYS-Medium_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Medium
de09a79e6b2efdcc97490a37b770764e62749fd0
26.447752
mit
3
18.731
true
false
false
false
4.272757
0.440613
44.061313
0.628473
47.734201
0.108006
10.800604
0.32802
10.402685
0.406927
8.732552
0.432596
36.955157
false
false
2024-07-17
2024-07-17
0
dnhkng/RYS-Medium
dnhkng_RYS-Llama-3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-8B-Instruct
293ab00d1e2be2752f97d5568fde2b09f6a1caae
21.922775
mit
1
8.248
true
false
false
true
1.610375
0.695777
69.57772
0.480871
25.373015
0.068731
6.873112
0.25755
1.006711
0.338344
0.292969
0.355718
28.413121
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-8B-Instruct
dnhkng_RYS-Llama-3-Huge-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-Huge-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-Huge-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-Huge-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-Huge-Instruct
cfe14a5339e88a7a89f075d9d48215d45f64acaf
34.644006
mit
2
99.646
true
false
false
true
29.473976
0.768592
76.859178
0.648087
49.073721
0.228852
22.885196
0.260906
1.454139
0.42076
11.928385
0.510971
45.663416
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-Huge-Instruct
dnhkng_RYS-Llama-3-Large-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3-Large-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3-Large-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3-Large-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3-Large-Instruct
01e3208aaf7bf6d2b09737960c701ec6628977fe
35.981216
mit
1
73.976
true
false
false
true
19.623034
0.805062
80.506168
0.652527
49.665539
0.230363
23.036254
0.28943
5.257271
0.418031
11.453906
0.513713
45.968159
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Llama-3-Large-Instruct
dnhkng_RYS-Llama-3.1-8B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama-3.1-8B-Instruct
d4e2393403dcae19860da7c29519c8fe6fbf2fad
26.763955
mit
10
8.685
true
false
false
true
1.943344
0.768492
76.849205
0.516365
31.085445
0.132931
13.293051
0.267617
2.348993
0.368104
7.679688
0.363946
29.327349
false
false
2024-08-08
2024-08-30
0
dnhkng/RYS-Llama-3.1-8B-Instruct
dnhkng_RYS-Llama3.1-Large_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Llama3.1-Large" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Llama3.1-Large</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Llama3.1-Large-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Llama3.1-Large
52cc979de78155b33689efa48f52a8aab184bd86
42.705292
mit
1
81.677
true
false
false
true
30.812658
0.8492
84.920012
0.689911
55.414864
0.350453
35.045317
0.374161
16.55481
0.455396
17.091146
0.52485
47.2056
false
false
2024-08-11
2024-08-22
0
dnhkng/RYS-Llama3.1-Large
dnhkng_RYS-Phi-3-medium-4k-instruct_bfloat16
bfloat16
🟢 pretrained
🟢
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-Phi-3-medium-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-Phi-3-medium-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-Phi-3-medium-4k-instruct
1009e916b1ff8c9a53bc9d8ff48bea2a15ccde26
29.09369
mit
1
17.709
true
false
false
false
4.621093
0.439139
43.913926
0.622631
46.748971
0.160876
16.087613
0.354866
13.982103
0.425281
11.09349
0.484624
42.736037
false
false
2024-08-06
2024-08-07
0
dnhkng/RYS-Phi-3-medium-4k-instruct
dnhkng_RYS-XLarge_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge
0f84dd9dde60f383e1e2821496befb4ce9a11ef6
45.34522
mit
85
77.965
true
false
false
false
27.152166
0.799566
79.956626
0.705003
58.773567
0.425227
42.522659
0.384228
17.897092
0.496969
23.721094
0.542803
49.200281
false
false
2024-07-24
2024-08-07
0
dnhkng/RYS-XLarge
dnhkng_RYS-XLarge-base_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge-base" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge-base</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge-base-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge-base
c718b3d9e24916e3b0347d3fdaa5e5a097c2f603
44.096836
mit
8
77.972
true
false
false
true
27.175047
0.791023
79.102337
0.704729
58.692146
0.379154
37.915408
0.379195
17.225951
0.490271
22.417188
0.543052
49.227985
false
false
2024-08-02
2024-08-30
0
dnhkng/RYS-XLarge-base
dnhkng_RYS-XLarge2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dnhkng/RYS-XLarge2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dnhkng/RYS-XLarge2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dnhkng__RYS-XLarge2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dnhkng/RYS-XLarge2
3ce16c9427e93e09ce10a28fa644469d49a51113
35.052228
0
77.965
false
false
false
true
26.751769
0.490197
49.019712
0.657395
51.549936
0.274924
27.492447
0.374161
16.55481
0.450802
17.05026
0.537816
48.646203
false
false
2024-10-11
0
Removed
dreamgen_WizardLM-2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/dreamgen/WizardLM-2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dreamgen/WizardLM-2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dreamgen__WizardLM-2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dreamgen/WizardLM-2-7B
b5f2d7bff91445a47331dcce588aee009d11d255
14.877543
apache-2.0
36
7.242
true
false
false
true
1.13345
0.458298
45.829843
0.348679
9.213114
0.033233
3.323263
0.286913
4.9217
0.394094
7.528385
0.266041
18.448951
false
false
2024-04-16
2024-06-27
0
dreamgen/WizardLM-2-7B
dustinwloring1988_Reflexis-8b-chat-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v1
e96bd9694ae87a4f612825310eb7afaea5b0aa28
17.353239
0
8.03
false
false
false
true
1.782283
0.365775
36.577503
0.46636
24.109958
0.115559
11.555891
0.254195
0.559284
0.375396
4.824479
0.338431
26.492317
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v2
817408ebfaa7ba0ea9433e1de4bfa120d38d2a0f
18.276634
0
8.03
false
false
false
true
1.880739
0.391204
39.120423
0.47238
24.892196
0.116314
11.63142
0.270134
2.684564
0.352635
4.91276
0.337766
26.41844
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v3
dcfa1a6a9f94a099286891d732b17cbbe97a644e
20.525441
0
8.03
false
false
false
true
1.782934
0.536734
53.673364
0.465831
24.168293
0.122356
12.23565
0.24245
0
0.351177
4.763802
0.354804
28.31154
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v4
81e20c2e40f2028818d5d6d27ec9e0d503ae8cc1
18.530939
0
8.03
false
false
false
true
1.77054
0.469789
46.978905
0.468601
24.33177
0.102719
10.271903
0.23406
0
0.339302
3.046094
0.339013
26.556959
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v5
12970eec99f458a3982eb502b71b6df0bc74bb52
18.53627
0
8.03
false
false
false
true
1.826193
0.423752
42.375231
0.478169
25.195784
0.121601
12.160121
0.270973
2.796421
0.335365
4.053906
0.321725
24.636155
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v6_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v6
a0b30a21a8eea9a32a2767755dc2dbd44eeb383f
20.344892
0
8.03
false
false
false
true
1.798406
0.493894
49.389398
0.480954
26.116103
0.129909
12.990937
0.262584
1.677852
0.375333
4.35
0.347906
27.545065
false
false
2024-09-14
0
Removed
dustinwloring1988_Reflexis-8b-chat-v7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/dustinwloring1988/Reflexis-8b-chat-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dustinwloring1988/Reflexis-8b-chat-v7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dustinwloring1988__Reflexis-8b-chat-v7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dustinwloring1988/Reflexis-8b-chat-v7
e8d990012ccd855e65d51cb7cfd1762632a8f217
19.095501
0
8
false
false
false
true
1.804222
0.398048
39.804829
0.480983
25.987497
0.163142
16.314199
0.261745
1.565996
0.322156
1.536198
0.364279
29.364288
false
false
2024-09-14
0
Removed
duyhv1411_Llama-3.2-1B-en-vi_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/duyhv1411/Llama-3.2-1B-en-vi" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">duyhv1411/Llama-3.2-1B-en-vi</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/duyhv1411__Llama-3.2-1B-en-vi-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
duyhv1411/Llama-3.2-1B-en-vi
d08c530d8256a72ad9548b0f26416ee98eae22ac
10.858146
1
1.236
false
false
false
true
0.379454
0.478832
47.883172
0.329091
6.0924
0.028701
2.870091
0.276846
3.579418
0.319708
0.930208
0.134142
3.793587
false
false
2025-03-06
2025-03-06
1
meta-llama/Llama-3.2-1B-Instruct
duyhv1411_Llama-3.2-3B-en-vi_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/duyhv1411/Llama-3.2-3B-en-vi" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">duyhv1411/Llama-3.2-3B-en-vi</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/duyhv1411__Llama-3.2-3B-en-vi-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
duyhv1411/Llama-3.2-3B-en-vi
7c5c74623642a3dc50de1f195babf32b8584fe90
10.861409
1
1.236
false
false
false
true
0.368319
0.485201
48.520149
0.327164
5.946255
0.022659
2.265861
0.275168
3.355705
0.32101
1.092969
0.135888
3.987515
false
false
2025-03-06
2025-03-06
1
meta-llama/Llama-3.2-1B-Instruct
dwikitheduck_gemma-2-2b-id_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gemma-2-2b-id" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gemma-2-2b-id</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gemma-2-2b-id-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gemma-2-2b-id
6f191d4a7618664619adda1cd96d9d1bf72f33b2
14.849648
gemma
0
2
true
false
false
true
6.048664
0.387856
38.785644
0.396217
15.415129
0.045317
4.531722
0.299497
6.599553
0.415427
10.728385
0.217337
13.037456
false
false
2024-10-24
2024-11-14
0
dwikitheduck/gemma-2-2b-id
dwikitheduck_gemma-2-2b-id-inst_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gemma-2-2b-id-inst" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gemma-2-2b-id-inst</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gemma-2-2b-id-inst-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gemma-2-2b-id-inst
1c046ade199128da926004e154698546d65e3084
14.849648
gemma
0
2
true
false
false
true
2.820793
0.387856
38.785644
0.396217
15.415129
0.045317
4.531722
0.299497
6.599553
0.415427
10.728385
0.217337
13.037456
false
false
2024-10-24
2024-11-24
0
dwikitheduck/gemma-2-2b-id-inst
dwikitheduck_gemma-2-2b-id-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gemma-2-2b-id-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gemma-2-2b-id-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gemma-2-2b-id-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gemma-2-2b-id-instruct
1c046ade199128da926004e154698546d65e3084
14.849648
gemma
0
2
true
false
false
true
2.833815
0.387856
38.785644
0.396217
15.415129
0.045317
4.531722
0.299497
6.599553
0.415427
10.728385
0.217337
13.037456
false
false
2024-10-24
2024-11-15
0
dwikitheduck/gemma-2-2b-id-instruct
dwikitheduck_gen-inst-1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gen-inst-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gen-inst-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gen-inst-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gen-inst-1
73180b0a57469bbd12f7d037a1cc25e53c252ad6
40.880198
apache-2.0
0
14.77
true
false
false
true
3.059745
0.775011
77.501141
0.641993
48.316742
0.455438
45.543807
0.371644
16.219239
0.420542
12.267708
0.508893
45.43255
false
false
2024-11-18
2024-11-24
2
Qwen/Qwen2.5-14B
dwikitheduck_gen-try1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gen-try1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gen-try1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gen-try1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gen-try1
9c2cab728518e179e5d8891f3f9775515f15cea2
39.412127
apache-2.0
0
14.77
true
false
false
true
3.166161
0.752205
75.220526
0.635851
47.413129
0.410121
41.012085
0.341443
12.192394
0.441563
14.961979
0.511054
45.672651
false
false
2024-11-11
2024-11-12
1
dwikitheduck/gen-try1 (Merge)
dwikitheduck_gen-try1-notemp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/dwikitheduck/gen-try1-notemp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dwikitheduck/gen-try1-notemp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dwikitheduck__gen-try1-notemp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dwikitheduck/gen-try1-notemp
391925b02f6cd60e7c4ef1321fe89a92d6b9fdf0
30.399295
0
14.77
false
false
false
false
3.791212
0.26271
26.270961
0.626267
45.749093
0.317976
31.797583
0.354027
13.870246
0.471417
17.927083
0.521027
46.780807
false
false
2024-11-13
0
Removed
dzakwan_dzakwan-MoE-4x7b-Beta_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/dzakwan/dzakwan-MoE-4x7b-Beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">dzakwan/dzakwan-MoE-4x7b-Beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/dzakwan__dzakwan-MoE-4x7b-Beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
dzakwan/dzakwan-MoE-4x7b-Beta
e89f82f2afa1961335de5a6d6d05bd850d1d61d9
20.769303
apache-2.0
0
24.154
true
true
false
false
2.912057
0.44426
44.426012
0.514044
32.074208
0.077795
7.779456
0.286074
4.809843
0.42674
12.109115
0.310755
23.417184
true
false
2024-05-26
2024-08-05
1
dzakwan/dzakwan-MoE-4x7b-Beta (Merge)
ehristoforu_Falcon3-8B-Franken-Basestruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Falcon3-8B-Franken-Basestruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Falcon3-8B-Franken-Basestruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Falcon3-8B-Franken-Basestruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Falcon3-8B-Franken-Basestruct
627cee5966188907ea34e4b473f655606fe82e5a
16.438747
0
8.406
false
false
false
true
1.593017
0.171485
17.148499
0.546283
34.856419
0
0
0.340604
12.080537
0.35549
1.802865
0.394697
32.744164
false
false
2025-01-06
2025-01-06
1
ehristoforu/Falcon3-8B-Franken-Basestruct (Merge)
ehristoforu_Falcon3-MoE-2x7B-Insruct_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Falcon3-MoE-2x7B-Insruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Falcon3-MoE-2x7B-Insruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Falcon3-MoE-2x7B-Insruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Falcon3-MoE-2x7B-Insruct
d7c85f436d22685010165483ba966d6ee2336cc8
36.667651
other
8
13.401
true
true
false
true
3.272421
0.764295
76.42954
0.56479
38.067154
0.412387
41.238671
0.312081
8.277405
0.484042
21.605208
0.409491
34.387928
false
false
2024-12-22
2024-12-22
1
ehristoforu/Falcon3-MoE-2x7B-Insruct (Merge)
ehristoforu_Gemma2-9B-it-psy10k-mental_health_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Gemma2-9B-it-psy10k-mental_health" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Gemma2-9B-it-psy10k-mental_health</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Gemma2-9B-it-psy10k-mental_health-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Gemma2-9B-it-psy10k-mental_health
4adc2d61d530d23026493d29e6191e06cf549fc6
27.192489
apache-2.0
2
9.242
true
false
false
true
4.55366
0.588666
58.866585
0.553938
35.566009
0.163142
16.314199
0.337248
11.63311
0.408604
9.342188
0.382896
31.432846
false
false
2024-07-16
2024-07-31
4
google/gemma-2-9b
ehristoforu_Gemma2-9b-it-train6_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/Gemma2-9b-it-train6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/Gemma2-9b-it-train6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__Gemma2-9b-it-train6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/Gemma2-9b-it-train6
e72bf00b427c22c48b468818cf75300a373a0c8a
30.533987
apache-2.0
2
9.242
true
false
false
true
3.987367
0.702522
70.252153
0.589809
40.987625
0.191088
19.108761
0.328859
10.514541
0.408417
9.652083
0.394199
32.688756
false
false
2024-07-22
2024-07-31
8
google/gemma-2-9b
ehristoforu_HappyLlama1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/HappyLlama1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/HappyLlama1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__HappyLlama1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/HappyLlama1
9bee1c404de70fc0ebe3cbcd2af2303a313a24be
26.735379
apache-2.0
0
8.03
true
false
false
true
1.428721
0.736269
73.626866
0.499573
28.499773
0.142749
14.274924
0.283557
4.474273
0.428687
11.252604
0.354555
28.283836
false
false
2024-11-29
2024-11-30
1
voidful/Llama-3.2-8B-Instruct
ehristoforu_QwenQwen2.5-7B-IT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/QwenQwen2.5-7B-IT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/QwenQwen2.5-7B-IT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__QwenQwen2.5-7B-IT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/QwenQwen2.5-7B-IT
fec28a5fa8a3139c24b67a9e7092a0175b801872
35.581218
1
7.613
false
false
false
true
1.308465
0.751831
75.18307
0.539796
34.969654
0.509063
50.906344
0.303691
7.158837
0.403365
8.720573
0.428939
36.548833
false
false
2025-01-29
2025-01-30
1
ehristoforu/QwenQwen2.5-7B-IT (Merge)
ehristoforu_QwenQwen2.5-7B-IT-Dare_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/QwenQwen2.5-7B-IT-Dare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/QwenQwen2.5-7B-IT-Dare</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__QwenQwen2.5-7B-IT-Dare-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/QwenQwen2.5-7B-IT-Dare
376d1c82e6fd973fb927f8540535d39e8f4c6168
35.565815
1
7.613
false
false
false
true
1.333669
0.750906
75.090648
0.539796
34.969654
0.509063
50.906344
0.303691
7.158837
0.403365
8.720573
0.428939
36.548833
false
false
2025-01-29
2025-01-30
1
ehristoforu/QwenQwen2.5-7B-IT-Dare (Merge)
ehristoforu_RQwen-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/RQwen-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/RQwen-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__RQwen-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/RQwen-v0.1
96d013d2db2ae47be9da1d1cd5b83782bd8f4096
39.730757
apache-2.0
2
14.77
true
false
false
true
3.417828
0.762497
76.249684
0.644644
48.490852
0.464502
46.450151
0.325503
10.067114
0.413906
10.438281
0.520196
46.68846
false
false
2024-11-24
2024-11-24
1
ehristoforu/RQwen-v0.1 (Merge)
ehristoforu_RQwen-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/RQwen-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/RQwen-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__RQwen-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/RQwen-v0.2
102ff435814388f4da9e7ebc25c5fbae7120638a
37.702469
apache-2.0
1
14.77
true
false
false
true
2.588487
0.750357
75.035683
0.642689
48.683837
0.327039
32.703927
0.337248
11.63311
0.420667
11.95
0.515874
46.208259
false
false
2024-11-24
2024-11-25
2
ehristoforu/RQwen-v0.1 (Merge)
ehristoforu_SoRu-0009_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/SoRu-0009" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/SoRu-0009</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__SoRu-0009-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/SoRu-0009
fe4f439882175c3cad8a0f08f7b14d18318b53d1
6.300241
apache-2.0
0
0.494
true
false
false
true
1.023862
0.258188
25.818827
0.314998
5.137458
0.021148
2.114804
0.260906
1.454139
0.336948
0.61849
0.12392
2.657728
false
false
2024-11-26
2024-11-27
10
Vikhrmodels/Vikhr-Qwen-2.5-0.5b-Instruct (Merge)
ehristoforu_coolqwen-3b-it_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/coolqwen-3b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/coolqwen-3b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__coolqwen-3b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/coolqwen-3b-it
5045993527b5da13a71e5b3df8c649bd55425124
28.654354
other
2
3.085
true
false
false
true
1.445891
0.647267
64.726703
0.485089
27.463696
0.367069
36.706949
0.282718
4.362416
0.41251
9.763802
0.360123
28.902556
false
false
2025-01-02
2025-01-02
1
Qwen/Qwen2.5-3B
ehristoforu_della-70b-test-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/della-70b-test-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/della-70b-test-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__della-70b-test-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/della-70b-test-v1
c705bbf0e900dc5375325e8ff80de0525aa713e4
12.869435
0
70.554
false
false
false
true
112.555403
0.497866
49.786566
0.302945
3.358456
0.009819
0.981873
0.252517
0.33557
0.455458
16.365625
0.157497
6.38852
false
false
2025-02-02
2025-02-06
1
ehristoforu/della-70b-test-v1 (Merge)
ehristoforu_falcon3-ultraset_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/falcon3-ultraset" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/falcon3-ultraset</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__falcon3-ultraset-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/falcon3-ultraset
50f5fd7e00b64eb515205e47f4acf28daf224055
32.536887
apache-2.0
0
7.456
true
false
false
true
1.236582
0.713512
71.351237
0.558368
37.555134
0.212236
21.223565
0.332215
10.961969
0.485313
20.997396
0.398188
33.132018
false
false
2024-12-30
2025-01-02
2
tiiuae/Falcon3-7B-Base
ehristoforu_fd-lora-merged-16x32_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/fd-lora-merged-16x32" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/fd-lora-merged-16x32</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__fd-lora-merged-16x32-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/fd-lora-merged-16x32
e31dfdc1714a914452489af06fe226ac3495b4c6
10.454538
mit
0
1.776
true
false
false
true
1.260966
0.34809
34.808974
0.330756
6.52718
0.170695
17.069486
0.253356
0.447427
0.351427
1.595052
0.120512
2.279108
false
false
2025-02-02
2025-02-02
0
ehristoforu/fd-lora-merged-16x32
ehristoforu_fd-lora-merged-64x128_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/fd-lora-merged-64x128" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/fd-lora-merged-64x128</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__fd-lora-merged-64x128-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/fd-lora-merged-64x128
04903de3e1bd71db5421ccd12f7c19d9b3cf7e04
11.210864
mit
0
1.777
true
false
false
true
1.274785
0.328106
32.810609
0.334471
7.819061
0.187311
18.731118
0.255034
0.671141
0.336823
1.269531
0.153674
5.963726
false
false
2025-02-02
2025-02-02
0
ehristoforu/fd-lora-merged-64x128
ehristoforu_fp4-14b-it-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/fp4-14b-it-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/fp4-14b-it-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__fp4-14b-it-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/fp4-14b-it-v1
3fc21d9a548a3f13b57a40c9f077ba1afc8ed20e
18.698031
0
14.66
false
false
false
true
1.927938
0.253467
25.346747
0.573972
38.779522
0.040785
4.07855
0.295302
6.040268
0.35949
2.336198
0.420462
35.6069
false
false
2025-01-19
2025-01-19
1
ehristoforu/fp4-14b-it-v1 (Merge)
ehristoforu_fp4-14b-v1-fix_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/fp4-14b-v1-fix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/fp4-14b-v1-fix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__fp4-14b-v1-fix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/fp4-14b-v1-fix
bc84457e330589381ec4ac00d697985b74e62dd1
40.373576
0
14.66
false
false
false
true
1.934558
0.67417
67.417009
0.681727
54.33378
0.420695
42.069486
0.354027
13.870246
0.453188
16.181771
0.535322
48.369164
false
false
2025-01-20
2025-01-20
1
ehristoforu/fp4-14b-v1-fix (Merge)
ehristoforu_fq2.5-7b-it-normalize_false_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/fq2.5-7b-it-normalize_false" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/fq2.5-7b-it-normalize_false</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__fq2.5-7b-it-normalize_false-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/fq2.5-7b-it-normalize_false
f64301d81ec5441eb79fe263e215ad24e1c111cc
36.496315
2
7.616
false
false
false
true
1.375598
0.739916
73.991565
0.551986
36.358312
0.462236
46.223565
0.302013
6.935123
0.461156
17.544531
0.441323
37.924793
false
false
2025-01-16
2025-01-16
1
ehristoforu/fq2.5-7b-it-normalize_false (Merge)
ehristoforu_fq2.5-7b-it-normalize_true_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/fq2.5-7b-it-normalize_true" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/fq2.5-7b-it-normalize_true</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__fq2.5-7b-it-normalize_true-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/fq2.5-7b-it-normalize_true
0100b08ab93ee049be9e43532e2caf05221da773
36.496315
1
7.616
false
false
false
true
1.370156
0.739916
73.991565
0.551986
36.358312
0.462236
46.223565
0.302013
6.935123
0.461156
17.544531
0.441323
37.924793
false
false
2025-01-16
2025-01-16
1
ehristoforu/fq2.5-7b-it-normalize_true (Merge)
ehristoforu_frqwen2.5-from7b-duable4layers-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/frqwen2.5-from7b-duable4layers-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/frqwen2.5-from7b-duable4layers-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__frqwen2.5-from7b-duable4layers-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/frqwen2.5-from7b-duable4layers-it
3322c001e5dfbb301f42f0bac355f2af61137608
34.626104
0
8.545
false
false
false
true
1.56421
0.772888
77.288816
0.526356
33.959778
0.450906
45.090634
0.295302
6.040268
0.416573
10.638281
0.41265
34.738845
false
false
2025-01-17
2025-01-17
1
ehristoforu/frqwen2.5-from7b-duable4layers-it (Merge)
ehristoforu_frqwen2.5-from7b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/frqwen2.5-from7b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/frqwen2.5-from7b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__frqwen2.5-from7b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/frqwen2.5-from7b-it
b8ebb360a763020f3bf0bbf0115e42a9d325b7e0
28.846489
0
13.206
false
false
false
true
3.930742
0.653212
65.321237
0.514291
30.71074
0.292296
29.229607
0.290268
5.369128
0.408573
9.371615
0.397689
33.076611
false
false
2025-01-17
2025-01-17
1
ehristoforu/frqwen2.5-from7b-it (Merge)
ehristoforu_mllama-3.1-8b-instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/mllama-3.1-8b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/mllama-3.1-8b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__mllama-3.1-8b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/mllama-3.1-8b-instruct
f7be209bee659916c03b6a3b77e67237cfed2c12
20.353062
1
8.03
false
false
false
true
1.489237
0.345791
34.579139
0.471766
26.370934
0.377644
37.76435
0.270134
2.684564
0.338
3.683333
0.253324
17.036052
false
false
2024-12-04
2024-12-04
1
ehristoforu/mllama-3.1-8b-instruct (Merge)
ehristoforu_mllama-3.1-8b-it_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/mllama-3.1-8b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/mllama-3.1-8b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__mllama-3.1-8b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/mllama-3.1-8b-it
5dc167a466759e5d60c073dca4e938463e2fd813
22.177602
0
8.03
false
false
false
false
1.467605
0.387882
38.788193
0.486803
28.024834
0.379909
37.990937
0.276846
3.579418
0.334865
6.658073
0.262217
18.024158
false
false
2024-12-04
2024-12-04
1
ehristoforu/mllama-3.1-8b-it (Merge)
ehristoforu_moremerge_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/ehristoforu/moremerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ehristoforu/moremerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ehristoforu__moremerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ehristoforu/moremerge
6f56bba4a2d82482b269dcbab69513b6f18cefe2
4.558916
0
7.613
false
false
false
true
1.428755
0.20191
20.190982
0.286844
1.987597
0
0
0.260067
1.342282
0.356573
3.104948
0.106549
0.727689
false
false
2025-01-27
2025-01-27
1
ehristoforu/moremerge (Merge)