Column schema (one row per evaluated model; "classes" = distinct values observed):

eval_name: string (length 12 to 111)
Precision: string (3 classes)
Type: string (7 classes)
T: string (7 classes)
Weight type: string (2 classes)
Architecture: string (64 classes)
Model: string (length 355 to 689)
fullname: string (length 4 to 102)
Model sha: string (length 0 to 40)
Average ⬆️: float64 (min 0.74, max 52.1)
Hub License: string (27 classes)
Hub ❤️: int64 (min 0, max 6.09k)
#Params (B): float64 (min -1, max 141)
Available on the hub: bool (2 classes)
MoE: bool (2 classes)
Flagged: bool (2 classes)
Chat Template: bool (2 classes)
CO₂ cost (kg): float64 (min 0.04, max 187)
IFEval Raw: float64 (min 0, max 0.9)
IFEval: float64 (min 0, max 90)
BBH Raw: float64 (min 0.22, max 0.83)
BBH: float64 (min 0.25, max 76.7)
MATH Lvl 5 Raw: float64 (min 0, max 0.71)
MATH Lvl 5: float64 (min 0, max 71.5)
GPQA Raw: float64 (min 0.21, max 0.47)
GPQA: float64 (min 0, max 29.4)
MUSR Raw: float64 (min 0.29, max 0.6)
MUSR: float64 (min 0, max 38.7)
MMLU-PRO Raw: float64 (min 0.1, max 0.73)
MMLU-PRO: float64 (min 0, max 70)
Merged: bool (2 classes)
Official Providers: bool (2 classes)
Upload To Hub Date: string (525 classes)
Submission Date: string (263 classes)
Generation: int64 (min 0, max 10)
Base Model: string (length 4 to 102)
T145_ZEUS-8B-V2L2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
T145/ZEUS-8B-V2L2 (details: open-llm-leaderboard/T145__ZEUS-8B-V2L2-details)
T145/ZEUS-8B-V2L2
d3ae250942e4b749c2d545a48f08a93a659a9b6e
29.935926
0
8.03
false
false
false
true
1.408079
0.802064
80.206408
0.520284
32.017509
0.201662
20.166163
0.299497
6.599553
0.397469
8.583594
0.388381
32.042332
false
false
2024-12-03
0
Removed
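Each benchmark appears twice per record: a Raw accuracy in [0, 1] and a normalized 0-100 score. The normalized value appears to rescale the raw accuracy above a random-guess baseline, (raw - baseline) / (1 - baseline) x 100, clipped at 0; the baselines used below (0.25 for 4-choice GPQA, 0.10 for 10-choice MMLU-PRO, 0 for IFEval) are inferred from the numbers in this dump, not taken from official documentation. A sketch checked against the T145/ZEUS-8B-V2L2 record above:

```python
def normalize(raw: float, baseline: float) -> float:
    """Rescale raw accuracy above a random-guess baseline to a 0-100 score,
    clipping at 0 when the model scores below the baseline."""
    return max(0.0, (raw - baseline) / (1.0 - baseline)) * 100.0

# Raw values from the T145/ZEUS-8B-V2L2 record; baselines are inferred.
ifeval = normalize(0.802064, 0.0)     # matches IFEval 80.206408
gpqa = normalize(0.299497, 0.25)      # matches GPQA 6.599553
mmlu_pro = normalize(0.388381, 0.10)  # matches MMLU-PRO 32.042332
```

The clipping also explains why a GPQA Raw near 0.21 (below the 0.25 baseline) can map to a normalized GPQA of 0 elsewhere in the schema ranges.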
T145_ZEUS-8B-V3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
T145/ZEUS-8B-V3 (details: open-llm-leaderboard/T145__ZEUS-8B-V3-details)
T145/ZEUS-8B-V3
2253fa275c722d46dd6380539042ec7f1bc0d7f7
29.604928
0
8.03
false
false
false
true
1.212701
0.788675
78.867516
0.526506
32.108252
0.167674
16.767372
0.322148
9.619687
0.401688
9.110937
0.380402
31.155807
false
false
2024-12-05
0
Removed
T145_ZEUS-8B-V30_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
T145/ZEUS-8B-V30 (details: open-llm-leaderboard/T145__ZEUS-8B-V30-details)
T145/ZEUS-8B-V30
7091cc3981243056ed3fae72307f4fac4fa367e4
29.095771
llama3.1
1
8.03
true
false
false
true
2.065009
0.743563
74.356264
0.524325
32.188254
0.15861
15.861027
0.32047
9.395973
0.402927
10.065885
0.394365
32.707225
true
false
2025-02-07
2025-02-07
1
T145/ZEUS-8B-V30 (Merge)
T145_ZEUS-8B-V4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
T145/ZEUS-8B-V4 (details: open-llm-leaderboard/T145__ZEUS-8B-V4-details)
T145/ZEUS-8B-V4
ca89fdfe275397f430092a0f644dc02b22ba2a8b
29.654622
0
8.03
false
false
false
true
1.301152
0.780732
78.073179
0.524597
32.046144
0.192598
19.259819
0.307047
7.606264
0.402896
9.961979
0.378823
30.980349
false
false
2024-12-06
0
Removed
T145_ZEUS-8B-V6_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
T145/ZEUS-8B-V6 (details: open-llm-leaderboard/T145__ZEUS-8B-V6-details)
T145/ZEUS-8B-V6
d7131128560dce428c3308ab46d7955b749c726d
29.757037
1
8.03
false
false
false
true
1.289855
0.783779
78.377926
0.523956
32.077848
0.202417
20.241692
0.30453
7.270694
0.406802
9.916927
0.375914
30.657137
false
false
2024-12-08
2024-12-09
1
T145/ZEUS-8B-V6 (Merge)
T145_ZEUS-8B-V7_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
T145/ZEUS-8B-V7 (details: open-llm-leaderboard/T145__ZEUS-8B-V7-details)
T145/ZEUS-8B-V7
dbaa3828be77d925f40ecf3762b90ec4ad70e6d9
28.470022
llama3.1
2
8.03
true
false
false
true
1.311759
0.778609
77.860854
0.507039
29.556016
0.148036
14.803625
0.29698
6.263982
0.416167
11.0875
0.381233
31.248153
true
false
2024-12-10
2024-12-11
1
T145/ZEUS-8B-V7 (Merge)
T145_ZEUS-8B-V8_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
T145/ZEUS-8B-V8 (details: open-llm-leaderboard/T145__ZEUS-8B-V8-details)
T145/ZEUS-8B-V8
c7da6c67926ddaff25602bfd1b9941d9822c1387
28.223541
llama3.1
3
8.03
true
false
false
true
1.320028
0.791398
79.139794
0.506451
29.394031
0.132931
13.293051
0.287752
5.033557
0.421375
11.805208
0.37608
30.675606
true
false
2024-12-12
2024-12-12
1
T145/ZEUS-8B-V8 (Merge)
T145_ZEUS-8B-V9_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
T145/ZEUS-8B-V9 (details: open-llm-leaderboard/T145__ZEUS-8B-V9-details)
T145/ZEUS-8B-V9
10b386571ad34d115433419d30b61746ef4d9735
25.864889
1
8.03
false
false
false
true
1.34642
0.555144
55.514369
0.520726
31.85055
0.213746
21.374622
0.291107
5.480984
0.394927
8.732552
0.390126
32.236259
false
false
2024-12-21
2024-12-21
1
T145/ZEUS-8B-V9 (Merge)
T145_qwen-2.5-3B-merge-test_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
T145/qwen-2.5-3B-merge-test (details: open-llm-leaderboard/T145__qwen-2.5-3B-merge-test-details)
T145/qwen-2.5-3B-merge-test
0d5f82d841f811fbf1ee07bfbf7c6eb1de812840
25.975399
0
3.397
false
false
false
true
1.567913
0.575102
57.510184
0.484249
27.889341
0.320242
32.024169
0.285235
4.697987
0.400729
8.291146
0.328956
25.439569
false
false
2024-11-16
0
Removed
THUDM_glm-4-9b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
ChatGLMModel
THUDM/glm-4-9b (details: open-llm-leaderboard/THUDM__glm-4-9b-details)
THUDM/glm-4-9b
99a140996f9d4f197842fb6b1aab217a42e27ef3
18.006732
other
126
9
true
false
false
false
3.344894
0.142608
14.260828
0.552837
35.811284
0
0
0.316275
8.836689
0.438583
14.189583
0.414478
34.942007
false
false
2024-06-04
2024-07-04
0
THUDM/glm-4-9b
THUDM_glm-4-9b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ChatGLMModel
THUDM/glm-4-9b-chat (details: open-llm-leaderboard/THUDM__glm-4-9b-chat-details)
THUDM/glm-4-9b-chat
04419001bc63e05e70991ade6da1f91c4aeec278
10.973477
other
669
9
true
false
false
true
0.494269
0
0
0.473639
25.205184
0
0
0.313758
8.501119
0.399427
8.061719
0.316656
24.072843
false
false
2024-06-04
2024-07-09
0
THUDM/glm-4-9b-chat
THUDM_glm-4-9b-chat-1m_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
ChatGLMModel
THUDM/glm-4-9b-chat-1m (details: open-llm-leaderboard/THUDM__glm-4-9b-chat-1m-details)
THUDM/glm-4-9b-chat-1m
0aa722c7e0745dd21453427dd44c257dd253304f
8.92251
other
186
9.484
true
false
false
true
0.41134
0
0
0.418006
17.108029
0
0
0.303691
7.158837
0.379458
5.232292
0.316323
24.035904
false
false
2024-06-04
2024-10-09
0
THUDM/glm-4-9b-chat-1m
THUDM_glm-4-9b-chat-1m-hf_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GlmForCausalLM
THUDM/glm-4-9b-chat-1m-hf (details: open-llm-leaderboard/THUDM__glm-4-9b-chat-1m-hf-details)
THUDM/glm-4-9b-chat-1m-hf
0588cb62942f0f0a5545c695e5c1b019d64eabdc
15.139214
other
10
9.484
true
false
false
true
2.094132
0.534111
53.41106
0.390095
14.405441
0.048338
4.833837
0.291946
5.592841
0.368885
3.54401
0.181433
9.048094
false
false
2024-10-24
2025-01-15
1
THUDM/glm-4-9b-chat-1m-hf (Merge)
THUDM_glm-4-9b-chat-hf_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GlmForCausalLM
THUDM/glm-4-9b-chat-hf (details: open-llm-leaderboard/THUDM__glm-4-9b-chat-hf-details)
THUDM/glm-4-9b-chat-hf
c7f73fd9e0f378c87f3c8f2c25aec6ad705043cd
20.544313
other
11
9.4
true
false
false
true
1.984788
0.651314
65.131407
0.443231
20.668086
0.084592
8.459215
0.302852
7.04698
0.359302
2.246094
0.277427
19.714096
false
false
2024-10-23
2025-01-15
1
THUDM/glm-4-9b-chat-hf (Merge)
TIGER-Lab_AceCodeRM-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalRM
TIGER-Lab/AceCodeRM-7B (details: open-llm-leaderboard/TIGER-Lab__AceCodeRM-7B-details)
TIGER-Lab/AceCodeRM-7B
cc0d74c2c70a2af30c33e9e1c5a787fb79ac5c2c
27.344716
mit
3
7.616
true
false
false
true
0.654894
0.585493
58.549316
0.477323
26.279158
0.346677
34.667674
0.30453
7.270694
0.419208
11.067708
0.336104
26.233747
false
false
2025-02-03
2025-03-07
1
TIGER-Lab/AceCodeRM-7B (Merge)
TIGER-Lab_AceCoder-Qwen2.5-7B-Ins-Rule_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
TIGER-Lab/AceCoder-Qwen2.5-7B-Ins-Rule (details: open-llm-leaderboard/TIGER-Lab__AceCoder-Qwen2.5-7B-Ins-Rule-details)
TIGER-Lab/AceCoder-Qwen2.5-7B-Ins-Rule
aedbaf4b30d6992872f6de21416fbf9c52795a81
35.109899
mit
1
7.616
true
false
false
true
0.647757
0.742413
74.241346
0.540443
35.040756
0.499245
49.924471
0.301174
6.823266
0.398031
7.720573
0.432181
36.908983
false
false
2025-02-04
2025-03-07
1
TIGER-Lab/AceCoder-Qwen2.5-7B-Ins-Rule (Merge)
TIGER-Lab_AceCoder-Qwen2.5-Coder-7B-Base-Rule_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
TIGER-Lab/AceCoder-Qwen2.5-Coder-7B-Base-Rule (details: open-llm-leaderboard/TIGER-Lab__AceCoder-Qwen2.5-Coder-7B-Base-Rule-details)
TIGER-Lab/AceCoder-Qwen2.5-Coder-7B-Base-Rule
352bab9841e39d359c630c61b46e58b2dea73384
21.333371
mit
1
7.616
true
false
false
true
0.975605
0.440763
44.076273
0.490238
29.405351
0.201662
20.166163
0.271812
2.908277
0.344885
0.94401
0.374501
30.500148
false
false
2025-02-04
2025-03-07
1
TIGER-Lab/AceCoder-Qwen2.5-Coder-7B-Base-Rule (Merge)
TIGER-Lab_AceCoder-Qwen2.5-Coder-7B-Ins-Rule_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
TIGER-Lab/AceCoder-Qwen2.5-Coder-7B-Ins-Rule (details: open-llm-leaderboard/TIGER-Lab__AceCoder-Qwen2.5-Coder-7B-Ins-Rule-details)
TIGER-Lab/AceCoder-Qwen2.5-Coder-7B-Ins-Rule
b230c078dfebe25af64dff924d8c41e620770ec4
28.02996
mit
0
7.616
true
false
false
true
0.641851
0.622238
62.223788
0.508924
30.542991
0.360272
36.02719
0.277685
3.691275
0.404635
8.71276
0.342836
26.981752
false
false
2025-02-04
2025-03-07
1
TIGER-Lab/AceCoder-Qwen2.5-Coder-7B-Ins-Rule (Merge)
TIGER-Lab_MAmmoTH2-7B-Plus_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
TIGER-Lab/MAmmoTH2-7B-Plus (details: open-llm-leaderboard/TIGER-Lab__MAmmoTH2-7B-Plus-details)
TIGER-Lab/MAmmoTH2-7B-Plus
3ed578d8dda09787137e363a0dc32e3a8ed908de
21.633508
mit
7
7.242
true
false
false
true
1.105327
0.557466
55.746641
0.423469
18.925953
0.185801
18.58006
0.280201
4.026846
0.412354
10.110938
0.301695
22.410609
false
false
2024-05-06
2024-06-27
0
TIGER-Lab/MAmmoTH2-7B-Plus
TIGER-Lab_Qwen2.5-Math-7B-CFT_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
TIGER-Lab/Qwen2.5-Math-7B-CFT (details: open-llm-leaderboard/TIGER-Lab__Qwen2.5-Math-7B-CFT-details)
TIGER-Lab/Qwen2.5-Math-7B-CFT
070621bc59d17068cc9e86b7e9f3db3efb08c981
23.521464
apache-2.0
8
7.616
true
false
false
true
1.170452
0.277698
27.769762
0.463694
24.585138
0.557402
55.740181
0.286074
4.809843
0.388667
6.616667
0.294465
21.607196
false
false
2025-01-30
2025-03-07
1
TIGER-Lab/Qwen2.5-Math-7B-CFT (Merge)
TTTXXX01_Mistral-7B-Base-SimPO2-5e-7_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
TTTXXX01/Mistral-7B-Base-SimPO2-5e-7 (details: open-llm-leaderboard/TTTXXX01__Mistral-7B-Base-SimPO2-5e-7-details)
TTTXXX01/Mistral-7B-Base-SimPO2-5e-7
7a271e3061165f4e1abfe26715c04e20c2ac935e
16.417453
apache-2.0
0
7.242
true
false
false
true
1.045992
0.439189
43.918913
0.431955
20.692627
0.026435
2.643505
0.297819
6.375839
0.360417
5.252083
0.276596
19.621749
false
false
2024-08-30
2024-09-01
2
mistralai/Mistral-7B-v0.1
Tarek07_Progenitor-V1.1-LLaMa-70B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
Tarek07/Progenitor-V1.1-LLaMa-70B (details: open-llm-leaderboard/Tarek07__Progenitor-V1.1-LLaMa-70B-details)
Tarek07/Progenitor-V1.1-LLaMa-70B
00b611cd032a7f944267d5eac9dee0e488e6428b
43.002945
llama3.3
10
70.554
true
false
false
false
28.145173
0.690606
69.060648
0.697112
56.24697
0.357251
35.725076
0.458054
27.740492
0.473563
19.628646
0.546543
49.615839
true
false
2025-01-24
2025-01-25
1
Tarek07/Progenitor-V1.1-LLaMa-70B (Merge)
Tarek07_Thalassic-Alpha-LLaMa-70B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
Tarek07/Thalassic-Alpha-LLaMa-70B (details: open-llm-leaderboard/Tarek07__Thalassic-Alpha-LLaMa-70B-details)
Tarek07/Thalassic-Alpha-LLaMa-70B
134030081c61d1e1cb9df44521ab130396607682
42.220376
llama3.3
1
70.554
true
false
false
false
28.772456
0.700348
70.034841
0.694041
55.954125
0.314955
31.495468
0.443792
25.838926
0.480198
20.72474
0.543467
49.274158
true
false
2025-01-27
2025-01-28
1
Tarek07/Thalassic-Alpha-LLaMa-70B (Merge)
TeeZee_DoubleBagel-57B-v1.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
TeeZee/DoubleBagel-57B-v1.0 (details: open-llm-leaderboard/TeeZee__DoubleBagel-57B-v1.0-details)
TeeZee/DoubleBagel-57B-v1.0
6e10dc1fb5223d1b045dc2a19c9c267a574e520f
8.707748
apache-2.0
1
56.703
true
false
false
true
18.737295
0.233633
23.363343
0.325079
5.522782
0.009819
0.981873
0.276007
3.467562
0.43149
13.602865
0.147773
5.308067
true
false
2024-08-05
2024-08-10
1
TeeZee/DoubleBagel-57B-v1.0 (Merge)
Telugu-LLM-Labs_Indic-gemma-2b-finetuned-sft-Navarasa-2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0 (details: open-llm-leaderboard/Telugu-LLM-Labs__Indic-gemma-2b-finetuned-sft-Navarasa-2.0-details)
Telugu-LLM-Labs/Indic-gemma-2b-finetuned-sft-Navarasa-2.0
6e53b24f9368eaf0b1f9aee0c7c59f2068d05a27
6.657818
other
23
2.506
true
false
false
false
1.410326
0.210303
21.030311
0.324088
6.021055
0.02719
2.719033
0.243289
0
0.389938
7.075521
0.127909
3.10099
false
false
2024-03-17
2025-02-06
1
google/gemma-2b
Telugu-LLM-Labs_Indic-gemma-7b-finetuned-sft-Navarasa-2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0 (details: open-llm-leaderboard/Telugu-LLM-Labs__Indic-gemma-7b-finetuned-sft-Navarasa-2.0-details)
Telugu-LLM-Labs/Indic-gemma-7b-finetuned-sft-Navarasa-2.0
84d251f088d2954561a4348883ba28f6f3265182
13.004828
other
18
8.538
true
false
false
false
1.861161
0.323684
32.368449
0.402299
16.263181
0.02568
2.567976
0.270134
2.684564
0.408323
9.140365
0.23504
15.004433
false
false
2024-03-17
2025-02-06
1
google/gemma-7b
TencentARC_LLaMA-Pro-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
TencentARC/LLaMA-Pro-8B (details: open-llm-leaderboard/TencentARC__LLaMA-Pro-8B-details)
TencentARC/LLaMA-Pro-8B
7115e7179060e0623d1ee9ff4476faed7e478d8c
8.816699
llama2
171
8.357
true
false
false
false
95.615467
0.227714
22.771358
0.34842
9.29395
0.018882
1.888218
0.260067
1.342282
0.401812
8.593229
0.1811
9.011155
false
true
2024-01-05
2024-06-12
0
TencentARC/LLaMA-Pro-8B
TencentARC_LLaMA-Pro-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
TencentARC/LLaMA-Pro-8B-Instruct (details: open-llm-leaderboard/TencentARC__LLaMA-Pro-8B-Instruct-details)
TencentARC/LLaMA-Pro-8B-Instruct
9850c8afce19a69d8fc4a1603a82441157514016
15.28346
llama2
62
8.357
true
false
false
true
6.210407
0.448606
44.860636
0.422421
19.485726
0.024924
2.492447
0.274329
3.243848
0.419021
11.110938
0.194564
10.507166
false
true
2024-01-06
2024-06-12
0
TencentARC/LLaMA-Pro-8B-Instruct
TencentARC_MetaMath-Mistral-Pro_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
TencentARC/MetaMath-Mistral-Pro (details: open-llm-leaderboard/TencentARC__MetaMath-Mistral-Pro-details)
TencentARC/MetaMath-Mistral-Pro
3835d38de15ed2a04c32aca879b782fc50e390bf
12.516527
apache-2.0
5
8.987
true
false
false
false
1.201505
0.211877
21.187671
0.441316
22.372279
0.076284
7.628399
0.269295
2.572707
0.352417
4.985417
0.247174
16.352689
false
true
2024-02-26
2024-06-12
0
TencentARC/MetaMath-Mistral-Pro
TencentARC_Mistral_Pro_8B_v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
TencentARC/Mistral_Pro_8B_v0.1 (details: open-llm-leaderboard/TencentARC__Mistral_Pro_8B_v0.1-details)
TencentARC/Mistral_Pro_8B_v0.1
366f159fc5b314ba2a955209d2bca4600f84dac0
14.195346
apache-2.0
66
8.987
true
false
false
false
1.264965
0.211452
21.145228
0.452598
22.894189
0.056647
5.664653
0.280201
4.026846
0.424229
11.828646
0.276513
19.612515
false
true
2024-02-22
2024-06-12
0
TencentARC/Mistral_Pro_8B_v0.1
TheDrummer_Cydonia-22B-v1.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
TheDrummer/Cydonia-22B-v1.2 (details: open-llm-leaderboard/TheDrummer__Cydonia-22B-v1.2-details)
TheDrummer/Cydonia-22B-v1.2
acd8da5efadc7dc404bb4eeebef2b27b1554a2ca
28.790088
other
42
22.247
true
false
false
false
3.257409
0.563511
56.351148
0.580856
39.932604
0.203172
20.317221
0.330537
10.738255
0.402177
10.505469
0.414063
34.895833
false
false
2024-10-07
2024-10-26
0
TheDrummer/Cydonia-22B-v1.2
TheDrummer_Gemmasutra-9B-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
TheDrummer/Gemmasutra-9B-v1 (details: open-llm-leaderboard/TheDrummer__Gemmasutra-9B-v1-details)
TheDrummer/Gemmasutra-9B-v1
21591f6a0140e095f1c6668ac7a267f214547609
22.748685
25
10.159
false
false
false
false
5.807637
0.241551
24.155131
0.588691
41.200396
0.083082
8.308157
0.310403
8.053691
0.484594
20.940885
0.404505
33.83385
false
false
2024-07-17
2024-09-19
1
TheDrummer/Gemmasutra-9B-v1 (Merge)
TheDrummer_Gemmasutra-Mini-2B-v1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
TheDrummer/Gemmasutra-Mini-2B-v1 (details: open-llm-leaderboard/TheDrummer__Gemmasutra-Mini-2B-v1-details)
TheDrummer/Gemmasutra-Mini-2B-v1
c1db4c8f975d3848edbdaf851217039c8dfdaeb5
9.129293
other
58
2.614
true
false
false
true
2.795909
0.254866
25.486598
0.357502
9.810336
0.037764
3.776435
0.270973
2.796421
0.348979
1.189062
0.205452
11.716903
false
false
2024-08-03
2024-10-28
0
TheDrummer/Gemmasutra-Mini-2B-v1
TheDrummer_Llama-3SOME-8B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
TheDrummer/Llama-3SOME-8B-v2 (details: open-llm-leaderboard/TheDrummer__Llama-3SOME-8B-v2-details)
TheDrummer/Llama-3SOME-8B-v2
2412c897532c1ab325ddf674c62004b234f2939e
21.812967
cc-by-nc-4.0
41
8.03
true
false
false
true
1.498506
0.450805
45.080498
0.520335
31.695273
0.093656
9.365559
0.302013
6.935123
0.383271
7.208854
0.375332
30.592494
false
false
2024-06-21
2025-01-12
0
TheDrummer/Llama-3SOME-8B-v2
TheDrummer_Ministrations-8B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
TheDrummer/Ministrations-8B-v1 (details: open-llm-leaderboard/TheDrummer__Ministrations-8B-v1-details)
TheDrummer/Ministrations-8B-v1
39b892de64401ec7990ebb816c4455ba4532bafb
21.290452
other
19
8.02
true
false
false
false
1.725112
0.282193
28.219347
0.487663
26.985637
0.18429
18.429003
0.324664
9.955257
0.444906
14.779948
0.364362
29.373522
false
false
2024-11-07
2024-11-14
0
TheDrummer/Ministrations-8B-v1
TheDrummer_Rocinante-12B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
TheDrummer/Rocinante-12B-v1 (details: open-llm-leaderboard/TheDrummer__Rocinante-12B-v1-details)
TheDrummer/Rocinante-12B-v1
74a4ae2584d45655298995198d5ab3e660364a1a
24.628093
other
28
12.248
true
false
false
true
3.728883
0.60765
60.764992
0.506545
30.025654
0.126888
12.688822
0.291107
5.480984
0.401719
11.28151
0.347739
27.526596
false
false
2024-08-14
2024-09-03
0
TheDrummer/Rocinante-12B-v1
TheDrummer_Tiger-Gemma-9B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
TheDrummer/Tiger-Gemma-9B-v1 (details: open-llm-leaderboard/TheDrummer__Tiger-Gemma-9B-v1-details)
TheDrummer/Tiger-Gemma-9B-v1
e95392c07bab3c483937583c711939ab3f5044dd
30.896644
39
9.242
true
false
false
true
3.240855
0.72815
72.81502
0.570369
37.220546
0.183535
18.353474
0.338926
11.856823
0.416167
10.4875
0.411818
34.646498
false
false
2024-07-12
2025-01-07
0
TheDrummer/Tiger-Gemma-9B-v1
TheDrummer_Tiger-Gemma-9B-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
TheDrummer/Tiger-Gemma-9B-v2 (details: open-llm-leaderboard/TheDrummer__Tiger-Gemma-9B-v2-details)
TheDrummer/Tiger-Gemma-9B-v2
9aea74832c16646c9c4948ccc2e76cb812f3c089
29.900202
29
9.242
true
false
false
true
3.339598
0.6986
69.859972
0.561719
35.469541
0.182024
18.202417
0.339765
11.96868
0.408417
9.31875
0.411237
34.581856
false
false
2024-07-16
2025-01-07
0
TheDrummer/Tiger-Gemma-9B-v2
TheDrummer_Tiger-Gemma-9B-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrummer/Tiger-Gemma-9B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrummer/Tiger-Gemma-9B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrummer__Tiger-Gemma-9B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrummer/Tiger-Gemma-9B-v3
fe32c1926e4057f75ebc2a4a57103564168cdbb7
29.473275
47
9.242
true
false
false
true
3.143506
0.682064
68.206359
0.581223
38.836023
0.162387
16.238671
0.338926
11.856823
0.400354
7.710937
0.405918
33.990839
false
false
2024-10-04
2025-01-07
0
TheDrummer/Tiger-Gemma-9B-v3
TheDrunkenSnail_Daughter-of-Rhodia-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrunkenSnail/Daughter-of-Rhodia-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrunkenSnail/Daughter-of-Rhodia-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrunkenSnail__Daughter-of-Rhodia-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrunkenSnail/Daughter-of-Rhodia-12B
f8f7d64218491055f0c983736e0befc6fbe92a63
27.609503
0
12.248
false
false
false
true
1.621451
0.690382
69.038152
0.517917
31.475833
0.122356
12.23565
0.317114
8.948546
0.434771
14.613021
0.364112
29.345819
false
false
2025-01-17
0
Removed
TheDrunkenSnail_Mother-of-Rhodia-12B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrunkenSnail/Mother-of-Rhodia-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrunkenSnail/Mother-of-Rhodia-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrunkenSnail__Mother-of-Rhodia-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrunkenSnail/Mother-of-Rhodia-12B
82376ee83b88faa62921f98d28ece7c0941cfda2
25.379307
other
1
12.248
true
false
false
true
1.71481
0.65049
65.048959
0.494791
28.502979
0.122356
12.23565
0.298658
6.487696
0.412417
11.652083
0.355136
28.348478
true
false
2025-01-23
2025-01-27
1
TheDrunkenSnail/Mother-of-Rhodia-12B (Merge)
TheDrunkenSnail_Son-of-Rhodia_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheDrunkenSnail/Son-of-Rhodia" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheDrunkenSnail/Son-of-Rhodia</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheDrunkenSnail__Son-of-Rhodia-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheDrunkenSnail/Son-of-Rhodia
f855ee46e8c6f187e2885bccdb4dd40a4ec27d94
27.216225
other
3
12.248
true
false
false
true
1.91242
0.704645
70.464479
0.509733
30.222057
0.13142
13.141994
0.312919
8.389262
0.420292
12.103125
0.360788
28.976433
true
false
2024-12-31
2024-12-31
1
TheDrunkenSnail/Son-of-Rhodia (Merge)
TheHierophant_Underground-Cognitive-V0.3-test_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TheHierophant/Underground-Cognitive-V0.3-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheHierophant/Underground-Cognitive-V0.3-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheHierophant__Underground-Cognitive-V0.3-test-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheHierophant/Underground-Cognitive-V0.3-test
2753b6f9068ad14efe836cde3160747cd208bf9e
22.406091
0
10.732
false
false
false
false
1.173162
0.48083
48.082975
0.529013
33.665102
0.058912
5.891239
0.298658
6.487696
0.435115
14.55599
0.331782
25.753546
false
false
2024-11-22
2024-11-22
1
TheHierophant/Underground-Cognitive-V0.3-test (Merge)
TheTsar1209_nemo-carpmuscle-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/nemo-carpmuscle-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/nemo-carpmuscle-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__nemo-carpmuscle-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/nemo-carpmuscle-v0.1
84d20db8220014958ff157047b2216910637ae39
16.794489
apache-2.0
1
12.248
true
false
false
false
3.616881
0.227564
22.756397
0.508353
30.034996
0.047583
4.758308
0.29698
6.263982
0.4135
10.220833
0.340592
26.732417
false
false
2024-08-15
2024-10-10
1
unsloth/Mistral-Nemo-Base-2407-bnb-4bit
TheTsar1209_qwen-carpmuscle-r-v0.3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-r-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-r-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-r-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-r-v0.3
30f8221d2f5f587343b1dbd65cf7d9bda4f5ef16
32.000497
1
14.766
false
false
false
true
4.513993
0.445509
44.550903
0.622712
46.375914
0.300604
30.060423
0.350671
13.422819
0.42776
12.003385
0.510306
45.589539
false
false
2024-10-23
2024-10-23
1
TheTsar1209/qwen-carpmuscle-r-v0.3 (Merge)
TheTsar1209_qwen-carpmuscle-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.1
7c7b06a1788aef48054c3c6d6ad90c6dc5264a81
33.445029
apache-2.0
0
14.77
true
false
false
true
4.352435
0.562163
56.216284
0.64343
48.825595
0.26284
26.283988
0.34396
12.527964
0.416104
10.146354
0.52003
46.669991
false
false
2024-10-05
2024-10-10
3
Qwen/Qwen2.5-14B
TheTsar1209_qwen-carpmuscle-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.2
081f6b067ebca9bc384af283f1d267880534b8e3
33.666713
apache-2.0
0
14.77
true
false
false
true
4.496398
0.525693
52.569294
0.638692
48.182441
0.283233
28.323263
0.355705
14.09396
0.434552
12.752344
0.514711
46.078975
false
false
2024-10-16
2024-10-19
3
Qwen/Qwen2.5-14B
TheTsar1209_qwen-carpmuscle-v0.3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.3
ec92820e4ff36b6f21e1ef63546fe2ddcb34456a
31.794404
apache-2.0
0
14.77
true
false
false
true
8.461958
0.447632
44.763228
0.615153
45.543392
0.313444
31.344411
0.356544
14.205817
0.413188
9.781771
0.50615
45.127807
false
false
2024-10-28
2024-10-28
2
Qwen/Qwen2.5-14B
TheTsar1209_qwen-carpmuscle-v0.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.4
3e11d5aad0f19bd652b8605620d0cf6af7a0ea00
37.39396
apache-2.0
1
14.77
true
false
false
true
2.739248
0.720207
72.020683
0.645367
49.384956
0.27719
27.719033
0.352349
13.646532
0.451604
15.550521
0.514378
46.042036
false
false
2024-11-18
2024-11-18
3
Qwen/Qwen2.5-14B
TheTsar1209_qwen-carpmuscle-v0.4.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/TheTsar1209/qwen-carpmuscle-v0.4.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TheTsar1209/qwen-carpmuscle-v0.4.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TheTsar1209__qwen-carpmuscle-v0.4.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TheTsar1209/qwen-carpmuscle-v0.4.1
052e690c01133ef4cfaa1457426679fe7effccda
37.605164
apache-2.0
0
14.77
true
false
false
true
2.678118
0.735994
73.599383
0.650653
50.003673
0.277946
27.794562
0.345638
12.751678
0.448906
14.913281
0.519116
46.56841
false
false
2025-01-19
2025-01-19
3
Qwen/Qwen2.5-14B
Tijmen2_cosmosage-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Tijmen2/cosmosage-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tijmen2/cosmosage-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tijmen2__cosmosage-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tijmen2/cosmosage-v3
e6d4b4e6868fcf113ab5261d71c7214a1f7fbb0c
17.354747
mit
1
8.03
true
false
false
true
1.661718
0.448232
44.82318
0.455064
22.687106
0.050604
5.060423
0.282718
4.362416
0.419885
10.685677
0.248587
16.509678
false
false
2024-06-20
2024-08-27
1
meta-llama/Meta-Llama-3-8B
TinyLlama_TinyLlama-1.1B-Chat-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v0.1
7abc14e7779eabc3a028bc695342869d0410dea2
3.957577
apache-2.0
55
1.1
true
false
false
false
0.18219
0.147854
14.785436
0.308353
3.363011
0.006042
0.60423
0.229027
0
0.35924
3.904948
0.109791
1.08784
false
true
2023-09-16
2024-12-02
0
TinyLlama/TinyLlama-1.1B-Chat-v0.1
TinyLlama_TinyLlama-1.1B-Chat-v0.5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v0.5
5c9e70dd07f5234bf6bf6a2425fffeecd5a6020b
4.126164
apache-2.0
8
1.1
true
false
false
false
0.189928
0.163367
16.336653
0.310505
3.407691
0.003776
0.377644
0.248322
0
0.366125
3.565625
0.109624
1.069371
false
true
2023-11-20
2024-10-23
0
TinyLlama/TinyLlama-1.1B-Chat-v0.5
TinyLlama_TinyLlama-1.1B-Chat-v0.6_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v0.6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v0.6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v0.6
bf9ae1c8bf026667e6f810768de259bb4a7f4777
4.294276
apache-2.0
98
1.1
true
false
false
true
0.647695
0.157421
15.74212
0.306698
3.390371
0.015861
1.586103
0.258389
1.118568
0.342219
2.277344
0.11486
1.651152
false
true
2023-11-20
2024-10-23
0
TinyLlama/TinyLlama-1.1B-Chat-v0.6
TinyLlama_TinyLlama-1.1B-Chat-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-Chat-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-Chat-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-Chat-v1.0
fe8a4ea1ffedaf415f4da2f062534de366a451e6
2.818859
apache-2.0
1,193
1.1
true
false
false
false
0.536883
0.059576
5.957637
0.310356
4.013397
0.015106
1.510574
0.25
0
0.351521
4.306771
0.110123
1.124778
false
true
2023-12-30
2024-08-04
0
TinyLlama/TinyLlama-1.1B-Chat-v1.0
TinyLlama_TinyLlama-1.1B-intermediate-step-1431k-3T_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama-1.1B-intermediate-step-1431k-3T-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
59f6f375b26bde864a6ca194a9a3044570490064
5.230318
apache-2.0
171
1.1
true
false
false
false
0.331596
0.227664
22.766371
0.307119
3.547093
0.012085
1.208459
0.252517
0.33557
0.338031
2.18724
0.112035
1.337175
false
true
2023-12-28
2024-11-27
0
TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
TinyLlama_TinyLlama_v1.1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/TinyLlama/TinyLlama_v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">TinyLlama/TinyLlama_v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/TinyLlama__TinyLlama_v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
TinyLlama/TinyLlama_v1.1
ff3c701f2424c7625fdefb9dd470f45ef18b02d6
4.824554
apache-2.0
87
1.1
true
false
false
false
0.497857
0.200061
20.006139
0.30237
3.210301
0.012085
1.208459
0.245805
0
0.369969
3.979427
0.104887
0.542996
false
true
2024-03-09
2024-06-12
0
TinyLlama/TinyLlama_v1.1
ToastyPigeon_Sto-vo-kor-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ToastyPigeon/Sto-vo-kor-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ToastyPigeon/Sto-vo-kor-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ToastyPigeon__Sto-vo-kor-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ToastyPigeon/Sto-vo-kor-12B
ee703d00350f35eeb68cbad28a4e9a4fcb30fde3
22.997938
apache-2.0
3
12.248
true
false
false
true
1.737156
0.550123
55.012256
0.506462
29.579484
0.108761
10.876133
0.305369
7.38255
0.393844
8.497135
0.339761
26.640071
false
false
2025-01-21
2025-01-22
1
ToastyPigeon/Sto-vo-kor-12B (Merge)
Trappu_Magnum-Picaro-0.7-v2-12b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Trappu/Magnum-Picaro-0.7-v2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Trappu/Magnum-Picaro-0.7-v2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Trappu__Magnum-Picaro-0.7-v2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Trappu/Magnum-Picaro-0.7-v2-12b
2ffc46cde49eb823f5588990bd6b848cd505271e
21.730064
apache-2.0
7
12.248
true
false
false
false
3.349918
0.300279
30.027882
0.550666
35.746233
0.066465
6.646526
0.322987
9.731544
0.472719
19.55651
0.358045
28.67169
true
false
2024-09-11
2024-09-12
1
Trappu/Magnum-Picaro-0.7-v2-12b (Merge)
Trappu_Nemo-Picaro-12B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Trappu/Nemo-Picaro-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Trappu/Nemo-Picaro-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Trappu__Nemo-Picaro-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Trappu/Nemo-Picaro-12B
d65bf383d744998ae93a5589ec886532bb7e18eb
21.362493
apache-2.0
2
12.248
true
false
false
false
3.682056
0.257714
25.771398
0.548959
35.973135
0.084592
8.459215
0.327181
10.290828
0.472594
18.740885
0.360455
28.939495
false
false
2024-09-10
2024-09-22
2
royallab/MN-LooseCannon-12B-v2 (Merge)
Tremontaine_L3-12B-Lunaris-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Tremontaine/L3-12B-Lunaris-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Tremontaine/L3-12B-Lunaris-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Tremontaine__L3-12B-Lunaris-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Tremontaine/L3-12B-Lunaris-v1
7be236530a835416ebca712d51d661c4488a45de
25.477255
2
11.52
false
false
false
true
2.281928
0.690931
69.093117
0.523022
32.180746
0.087613
8.761329
0.309564
7.941834
0.367365
4.053906
0.377493
30.832595
false
false
2024-07-14
2024-07-15
1
Tremontaine/L3-12B-Lunaris-v1 (Merge)
Triangle104_Annunaki-12b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Annunaki-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Annunaki-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Annunaki-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Annunaki-12b
70a240d771e5ec614b1cd9f080636cec5d9b4ae5
23.369698
apache-2.0
1
12.248
true
false
false
false
1.573334
0.387207
38.720706
0.549897
35.321145
0.121601
12.160121
0.321309
9.50783
0.440875
14.276042
0.372091
30.232343
true
false
2025-01-19
2025-01-27
1
Triangle104/Annunaki-12b (Merge)
Triangle104_BigTalker-Lite-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/BigTalker-Lite-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/BigTalker-Lite-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__BigTalker-Lite-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/BigTalker-Lite-8B
ea4454ee9f66e54cb0f9efc87a702048582cafb7
20.978993
llama3.1
1
8.03
true
false
false
false
1.335416
0.368922
36.892224
0.530814
32.683408
0.101964
10.196375
0.310403
8.053691
0.420844
11.038802
0.343085
27.009456
true
false
2024-12-06
2025-01-30
1
Triangle104/BigTalker-Lite-8B (Merge)
Triangle104_Chatty-Harry_V2.0_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Chatty-Harry_V2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Chatty-Harry_V2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Chatty-Harry_V2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Chatty-Harry_V2.0
79d69071632b800de651dffd244d3036d0aeb0c4
21.833191
apache-2.0
2
12.248
true
false
false
false
1.71688
0.332552
33.255207
0.531893
32.763032
0.138973
13.897281
0.322987
9.731544
0.407823
11.544531
0.368268
29.80755
true
false
2024-10-27
2025-01-27
1
Triangle104/Chatty-Harry_V2.0 (Merge)
Triangle104_Chatty-Harry_V3.0_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Chatty-Harry_V3.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Chatty-Harry_V3.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Chatty-Harry_V3.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Chatty-Harry_V3.0
c16d5698bc8c96365d4874b2bb207e4baa79b8ea
23.114767
apache-2.0
3
12.248
true
false
false
false
1.667963
0.367498
36.749824
0.552619
35.894707
0.112538
11.253776
0.322987
9.731544
0.440844
15.038802
0.37018
30.019947
true
false
2024-11-20
2025-01-27
1
Triangle104/Chatty-Harry_V3.0 (Merge)
Triangle104_Chronos-Prism_V1.0_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Chronos-Prism_V1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Chronos-Prism_V1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Chronos-Prism_V1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Chronos-Prism_V1.0
02889fdec891a9e092c7c0ab4c3b8562117df700
22.183415
1
12.248
false
false
false
false
1.654034
0.325933
32.593297
0.555419
36.575709
0.120091
12.009063
0.309564
7.941834
0.426271
14.283854
0.367271
29.696735
false
false
2024-11-24
2025-01-30
1
Triangle104/Chronos-Prism_V1.0 (Merge)
Triangle104_DS-Distilled-Hermes-Llama-3.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DS-Distilled-Hermes-Llama-3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DS-Distilled-Hermes-Llama-3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DS-Distilled-Hermes-Llama-3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DS-Distilled-Hermes-Llama-3.1
48b1a24bcfff16ae849d2003f7773fa7e241d332
22.385009
1
8.03
false
false
false
false
1.42468
0.322935
32.293537
0.511701
30.312466
0.293051
29.305136
0.318792
9.17226
0.403854
9.781771
0.311004
23.444888
false
false
2025-01-26
2025-01-27
1
Triangle104/DS-Distilled-Hermes-Llama-3.1 (Merge)
Triangle104_DS-Distilled-Hermes-Llama-3.1_TIES_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DS-Distilled-Hermes-Llama-3.1_TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DS-Distilled-Hermes-Llama-3.1_TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DS-Distilled-Hermes-Llama-3.1_TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DS-Distilled-Hermes-Llama-3.1_TIES
f17071ff432257711720cc3956469fe37391ef81
3.378416
llama3.1
1
8.03
true
false
false
false
1.505849
0.136414
13.64136
0.292845
2.108593
0.009063
0.906344
0.244966
0
0.362094
2.461719
0.110372
1.152482
true
false
2025-01-26
2025-01-30
1
Triangle104/DS-Distilled-Hermes-Llama-3.1_TIES (Merge)
Triangle104_DS-R1-Distill-Q2.5-10B-Harmony_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DS-R1-Distill-Q2.5-10B-Harmony" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DS-R1-Distill-Q2.5-10B-Harmony</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DS-R1-Distill-Q2.5-10B-Harmony-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DS-R1-Distill-Q2.5-10B-Harmony
010d50098f18672d302219150ce522e8e37eaf1e
3.758424
apache-2.0
0
10.366
true
false
false
false
1.871262
0.175082
17.508212
0.264328
1.528324
0
0
0.21057
0
0.31276
1.595052
0.117271
1.918957
true
false
2025-01-27
2025-01-27
1
Triangle104/DS-R1-Distill-Q2.5-10B-Harmony (Merge)
Triangle104_DS-R1-Distill-Q2.5-14B-Harmony_V0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DS-R1-Distill-Q2.5-14B-Harmony_V0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DS-R1-Distill-Q2.5-14B-Harmony_V0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DS-R1-Distill-Q2.5-14B-Harmony_V0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DS-R1-Distill-Q2.5-14B-Harmony_V0.1
baa7a816d101fd6a8d7909f4fdf038538e33822c
38.406333
apache-2.0
1
14.77
true
false
false
false
3.679393
0.451504
45.150423
0.578338
38.71537
0.555136
55.513595
0.393456
19.127517
0.556688
31.919271
0.460106
40.01182
true
false
2025-01-28
2025-01-28
1
Triangle104/DS-R1-Distill-Q2.5-14B-Harmony_V0.1 (Merge)
Triangle104_DS-R1-Distill-Q2.5-7B-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DS-R1-Distill-Q2.5-7B-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DS-R1-Distill-Q2.5-7B-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DS-R1-Distill-Q2.5-7B-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DS-R1-Distill-Q2.5-7B-RP
1dce22130cb198a763c59cfe2bcf3a3ac12b4fab
23.291514
apache-2.0
0
7.616
true
false
false
false
1.333267
0.344542
34.454248
0.438349
20.781374
0.468278
46.827795
0.313758
8.501119
0.403021
8.177604
0.289063
21.006944
true
false
2025-01-27
2025-01-27
1
Triangle104/DS-R1-Distill-Q2.5-7B-RP (Merge)
Triangle104_DS-R1-Llama-8B-Harmony_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DS-R1-Llama-8B-Harmony" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DS-R1-Llama-8B-Harmony</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DS-R1-Llama-8B-Harmony-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DS-R1-Llama-8B-Harmony
01aee21e38112e386511e5abeee1b1d3e2f904a5
21.179062
1
8.03
false
false
false
false
1.555332
0.356633
35.663262
0.415365
17.496342
0.428248
42.824773
0.291946
5.592841
0.376198
6.12474
0.274352
19.372414
false
false
2025-01-25
2025-01-27
1
Triangle104/DS-R1-Llama-8B-Harmony (Merge)
Triangle104_DSR1-Distill-Llama-Lit-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DSR1-Distill-Llama-Lit-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DSR1-Distill-Llama-Lit-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DSR1-Distill-Llama-Lit-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DSR1-Distill-Llama-Lit-8B
b5952b3507961626e50dc5640677000cdfbc4726
17.835033
apache-2.0
0
8.03
true
false
false
false
1.4702
0.188521
18.85209
0.428406
19.22516
0.351964
35.196375
0.302852
7.04698
0.353469
6.716927
0.279754
19.972665
true
false
2025-02-09
2025-02-11
1
Triangle104/DSR1-Distill-Llama-Lit-8B (Merge)
Triangle104_DSR1-Distill-Qwen-7B-RP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/DSR1-Distill-Qwen-7B-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/DSR1-Distill-Qwen-7B-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__DSR1-Distill-Qwen-7B-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/DSR1-Distill-Qwen-7B-RP
ad284db0b08b115a9fb2675cdffc2447b82ec3bc
24.099711
apache-2.0
0
7.613
true
false
false
false
0.686088
0.360929
36.0929
0.432649
19.853298
0.480363
48.036254
0.319631
9.284116
0.404542
8.801042
0.302776
22.530659
true
false
2025-02-09
2025-02-09
1
Triangle104/DSR1-Distill-Qwen-7B-RP (Merge)
Triangle104_Dark-Chivalry_V1.0_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Dark-Chivalry_V1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Dark-Chivalry_V1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Dark-Chivalry_V1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Dark-Chivalry_V1.0
94882aa3e066ac9a896b57eaf026df23e930906e
21.67295
apache-2.0
1
8.03
true
false
false
false
1.422058
0.43257
43.257003
0.497421
28.026139
0.13142
13.141994
0.293624
5.816555
0.418177
12.638802
0.344415
27.15721
true
false
2024-11-20
2025-01-30
1
Triangle104/Dark-Chivalry_V1.0 (Merge)
Triangle104_Distilled-DarkPlanet-Allades-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Distilled-DarkPlanet-Allades-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Distilled-DarkPlanet-Allades-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Distilled-DarkPlanet-Allades-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Distilled-DarkPlanet-Allades-8B
61d7f1f5b92a156aa342c6c78d8d81d0eff96d55
21.683058
llama3.1
2
8.03
true
false
false
false
1.457406
0.346016
34.601635
0.463395
23.038206
0.400302
40.030211
0.305369
7.38255
0.35375
3.91875
0.290143
21.126995
true
false
2025-01-29
2025-01-29
1
Triangle104/Distilled-DarkPlanet-Allades-8B (Merge)
Triangle104_Distilled-DarkPlanet-Allades-8B_TIES_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Distilled-DarkPlanet-Allades-8B_TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Distilled-DarkPlanet-Allades-8B_TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Distilled-DarkPlanet-Allades-8B_TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Distilled-DarkPlanet-Allades-8B_TIES
ce441290f3193f30950e1e979a72a6d3e7be5801
20.213926
llama3.1
1
8.03
true
false
false
false
1.445747
0.389181
38.918071
0.504156
29.961797
0.090634
9.063444
0.314597
8.612975
0.386802
8.05026
0.340093
26.677009
true
false
2025-02-02
2025-02-05
1
Triangle104/Distilled-DarkPlanet-Allades-8B_TIES (Merge)
Triangle104_Distilled-Whiskey-8b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Distilled-Whiskey-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Distilled-Whiskey-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Distilled-Whiskey-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Distilled-Whiskey-8b
959128ec0161c4d4cc424ec75e63fbcf997c3f81
22.935848
1
8.03
false
false
false
false
1.478627
0.344767
34.476744
0.502782
29.317663
0.254532
25.453172
0.331376
10.850112
0.417219
11.21901
0.336686
26.298389
false
false
2025-01-25
2025-01-27
1
Triangle104/Distilled-Whiskey-8b (Merge)
Triangle104_Dolphin3-Llama3.2-Smart_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Dolphin3-Llama3.2-Smart" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Dolphin3-Llama3.2-Smart</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Dolphin3-Llama3.2-Smart-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Dolphin3-Llama3.2-Smart
ecf28b35c75e6e4f3144701d8887e9328a42cace
14.183463
0
3.213
false
false
false
false
1.216945
0.41366
41.36602
0.397508
15.349666
0.043807
4.380665
0.269295
2.572707
0.392167
8.154167
0.219498
13.277556
false
false
2025-01-19
2025-01-27
1
Triangle104/Dolphin3-Llama3.2-Smart (Merge)
Triangle104_Gemmadevi-Stock-10B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Gemmadevi-Stock-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Gemmadevi-Stock-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Gemmadevi-Stock-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Gemmadevi-Stock-10B
a8085fb5e559a1ac384ab0b5bfb3599bfff6fabc
22.723742
2
10.159
false
false
false
false
3.862203
0.158195
15.81947
0.606592
43.62184
0.096677
9.667674
0.353188
13.758389
0.462115
17.23099
0.426197
36.24409
false
false
2025-02-04
2025-02-05
1
Triangle104/Gemmadevi-Stock-10B (Merge)
Triangle104_Hermes-Llama-3.2-CoT_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Hermes-Llama-3.2-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Hermes-Llama-3.2-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Hermes-Llama-3.2-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Hermes-Llama-3.2-CoT
91cdadaa53f538ae95d6e59016639d10381c6da6
17.621221
llama3.2
1
3.213
true
false
false
false
1.179574
0.417757
41.775711
0.461575
23.795789
0.095166
9.516616
0.279362
3.914989
0.369781
5.089323
0.294714
21.6349
true
false
2025-01-11
2025-01-27
1
Triangle104/Hermes-Llama-3.2-CoT (Merge)
Triangle104_Hermes-Llama-3.2-CoT-Summary_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Hermes-Llama-3.2-CoT-Summary" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Hermes-Llama-3.2-CoT-Summary</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Hermes-Llama-3.2-CoT-Summary-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Hermes-Llama-3.2-CoT-Summary
92e850a4a23b8d6694026f31ee6c72c4d1b6a25d
16.766151
llama3.2
1
3.213
true
false
false
false
1.211524
0.483028
48.302836
0.42003
17.388422
0.083082
8.308157
0.255872
0.782998
0.3575
4.6875
0.290143
21.126995
true
false
2025-01-11
2025-01-27
1
Triangle104/Hermes-Llama-3.2-CoT-Summary (Merge)
Triangle104_Hermes3-L3.1-DirtyHarry-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Hermes3-L3.1-DirtyHarry-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Hermes3-L3.1-DirtyHarry-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Hermes3-L3.1-DirtyHarry-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Hermes3-L3.1-DirtyHarry-8B
cf6bca48a8378975588c17238050ce687d226204
18.559822
1
8.03
false
false
false
false
1.322269
0.324234
32.423414
0.506639
29.678776
0.071752
7.175227
0.302013
6.935123
0.406896
9.161979
0.33386
25.984412
false
false
2024-12-21
2025-01-27
1
Triangle104/Hermes3-L3.1-DirtyHarry-8B (Merge)
Triangle104_Herodotos-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Herodotos-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Herodotos-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Herodotos-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Herodotos-14B
0fad098b9e657b0aabd073b80f72e0961cb3d4c7
38.33726
apache-2.0
1
14.77
true
false
false
false
3.835057
0.466742
46.674158
0.643504
48.909903
0.504532
50.453172
0.373322
16.442953
0.479542
19.876042
0.529006
47.667332
true
false
2025-01-11
2025-01-27
1
Triangle104/Herodotos-14B (Merge)
Triangle104_Herodotos-14B_V0.1_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Herodotos-14B_V0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Herodotos-14B_V0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Herodotos-14B_V0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Herodotos-14B_V0.1
95f343835e08e360fa8b7ffe79c1b67045952825
4.563889
apache-2.0
0
14.77
true
false
false
false
1.997502
0.187872
18.787151
0.301722
2.954726
0
0
0.223993
0
0.368385
3.814844
0.116439
1.826611
true
false
2025-02-11
2025-02-11
1
Triangle104/Herodotos-14B_V0.1 (Merge)
Triangle104_L3.1-8B-Dusky-Ink_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/L3.1-8B-Dusky-Ink" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/L3.1-8B-Dusky-Ink</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__L3.1-8B-Dusky-Ink-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/L3.1-8B-Dusky-Ink
b972ab707f97d67d52b2122f77e4be32822fc578
22.391499
1
8.03
false
false
false
false
1.440204
0.452978
45.29781
0.50979
30.509038
0.123112
12.311178
0.28943
5.257271
0.422396
11.166146
0.368268
29.80755
false
false
2025-02-02
2025-02-05
1
Triangle104/L3.1-8B-Dusky-Ink (Merge)
Triangle104_L3.1-8B-Dusky-Ink_v0.r1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/L3.1-8B-Dusky-Ink_v0.r1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/L3.1-8B-Dusky-Ink_v0.r1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__L3.1-8B-Dusky-Ink_v0.r1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/L3.1-8B-Dusky-Ink_v0.r1
c169ef1d2274ffc8af2496a8c3f16d19ff9a5b5f
13.969324
llama3.1
0
8.03
true
false
false
false
1.467099
0.198488
19.848779
0.433728
20.175489
0.043051
4.305136
0.303691
7.158837
0.398833
7.820833
0.320562
24.506871
true
false
2025-02-08
2025-02-09
1
Triangle104/L3.1-8B-Dusky-Ink_v0.r1 (Merge)
Triangle104_LThreePointOne-8B-HermesBlackroot_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/LThreePointOne-8B-HermesBlackroot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/LThreePointOne-8B-HermesBlackroot</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__LThreePointOne-8B-HermesBlackroot-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/LThreePointOne-8B-HermesBlackroot
52244f9da40c719f1d302e1ce87f9a07a5b96955
15.160129
apache-2.0
1
8.03
true
false
false
false
1.477824
0.179203
17.92034
0.499833
29.26725
0.019637
1.963746
0.307047
7.606264
0.358552
8.81901
0.328457
25.384161
true
false
2025-02-14
2025-02-19
1
Triangle104/LThreePointOne-8B-HermesBlackroot (Merge)
Triangle104_LThreePointOne-8B-HermesInk_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/LThreePointOne-8B-HermesInk" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/LThreePointOne-8B-HermesInk</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__LThreePointOne-8B-HermesInk-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/LThreePointOne-8B-HermesInk
b572deac44832f1e5a7bb75e33d533620aeedac6
22.696155
apache-2.0
2
8.03
true
false
false
false
2.072176
0.403119
40.311928
0.522277
31.479947
0.172205
17.220544
0.322987
9.731544
0.412938
10.017188
0.346742
27.41578
true
false
2025-02-14
2025-02-19
1
Triangle104/LThreePointOne-8B-HermesInk (Merge)
Triangle104_Llama3.1-Allades-Lit-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Llama3.1-Allades-Lit-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Llama3.1-Allades-Lit-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Llama3.1-Allades-Lit-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Llama3.1-Allades-Lit-8b
9821725c1b319134754d350de2071b0d5d1c8ad4
11.875473
0
8.03
false
false
false
false
0.772031
0.246124
24.612362
0.41833
17.446907
0.002266
0.226586
0.284396
4.58613
0.370833
5.220833
0.27244
19.160018
false
false
2025-02-11
2025-02-11
1
Triangle104/Llama3.1-Allades-Lit-8b (Merge)
Triangle104_Llama3.1-cc-Lit-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Llama3.1-cc-Lit-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Llama3.1-cc-Lit-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Llama3.1-cc-Lit-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Llama3.1-cc-Lit-8b
7285d0694395654ed0255b4ca2c7e7fa8aaad482
12.717562
apache-2.0
0
8.03
true
false
false
false
1.507734
0.299305
29.930473
0.384799
13.866972
0.003021
0.302115
0.277685
3.691275
0.385406
6.242448
0.300449
22.272089
true
false
2025-02-11
2025-02-11
1
Triangle104/Llama3.1-cc-Lit-8b (Merge)
Triangle104_Minerva-1.5b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Minerva-1.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Minerva-1.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Minerva-1.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Minerva-1.5b
101165d451aa5dc66514572056b1216851ac86c8
14.467081
apache-2.0
0
1.777
true
false
false
false
1.130704
0.26943
26.942956
0.402571
16.381923
0.102719
10.271903
0.310403
8.053691
0.3655
6.2875
0.269781
18.864509
true
false
2025-01-19
2025-01-27
1
Triangle104/Minerva-1.5b (Merge)
Triangle104_Minerva-1.5b_V0.2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Minerva-1.5b_V0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Minerva-1.5b_V0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Minerva-1.5b_V0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Minerva-1.5b_V0.2
e8a05f264dd1ed568812459edeaa17e977efedee
15.02144
apache-2.0
1
1.777
true
false
false
false
1.1414
0.308347
30.834741
0.398904
14.994536
0.114048
11.404834
0.285235
4.697987
0.39601
6.967969
0.291057
21.228576
true
false
2025-01-19
2025-01-27
1
Triangle104/Minerva-1.5b_V0.2 (Merge)
Triangle104_Minerva-10b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Minerva-10b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Minerva-10b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Minerva-10b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Minerva-10b
ee678589020031634d02b020767d821dd5b10876
10.977971
apache-2.0
0
10.067
true
false
false
false
1.561338
0.187872
18.787151
0.446204
22.692483
0
0
0.28104
4.138702
0.362708
5.605208
0.231799
14.644282
true
false
2025-01-19
2025-01-27
1
Triangle104/Minerva-10b (Merge)
Triangle104_Minerva-14b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Minerva-14b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Minerva-14b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Minerva-14b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Minerva-14b
4eed798e0edb7869fadf60f254abdad75825b348
32.40842
apache-2.0
1
14.77
true
false
false
false
3.038677
0.34679
34.678985
0.630083
47.062225
0.305136
30.513595
0.374161
16.55481
0.476625
19.044792
0.519365
46.596114
true
false
2025-01-19
2025-01-27
1
Triangle104/Minerva-14b (Merge)
Triangle104_Minerva-14b-V0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Minerva-14b-V0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Minerva-14b-V0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Minerva-14b-V0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Minerva-14b-V0.1
5c4ed9ada62c298fba8705132d69382b8cf6ec6c
27.042982
1
14.77
false
false
false
false
2.809769
0.086129
8.612925
0.608979
43.620099
0.305136
30.513595
0.365772
15.436242
0.470021
18.319271
0.511802
45.755762
false
false
2025-01-26
2025-01-27
1
Triangle104/Minerva-14b-V0.1 (Merge)
Triangle104_Minerva-7b_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Minerva-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Minerva-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Minerva-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Minerva-7b
00fd640525d0f802968a01336d0770e4243c9d5b
26.501051
apache-2.0
1
7.616
true
false
false
false
1.163702
0.37242
37.241962
0.54984
36.075865
0.283988
28.398792
0.322987
9.731544
0.414333
9.291667
0.444398
38.266475
true
false
2025-01-19
2025-01-27
1
Triangle104/Minerva-7b (Merge)
Triangle104_Minerva-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Minerva-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Minerva-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Minerva-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Minerva-8b
be3d362c6b6c491d780a774e3b8f18ba8a36c679
14.317747
apache-2.0
0
7.248
true
false
false
false
1.476809
0.172085
17.208451
0.466861
25.375307
0.004532
0.453172
0.312081
8.277405
0.427292
11.378125
0.308926
23.214022
true
false
2025-01-19
2025-01-27
1
Triangle104/Minerva-8b (Merge)
Triangle104_Mistral-Redemption-Arc_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Mistral-Redemption-Arc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Mistral-Redemption-Arc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Mistral-Redemption-Arc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Mistral-Redemption-Arc
c4b74ad0240cd1c30eebf06f078483c9c5c576e5
32.78693
apache-2.0
1
23.572
true
false
false
false
1.593036
0.402894
40.289432
0.625488
46.276529
0.410121
41.012085
0.347315
12.975391
0.45951
17.172135
0.450964
38.996011
true
false
2025-02-07
2025-02-08
1
Triangle104/Mistral-Redemption-Arc (Merge)
Triangle104_Mistral-Small-24b-Harmony_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Triangle104/Mistral-Small-24b-Harmony" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Triangle104/Mistral-Small-24b-Harmony</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Mistral-Small-24b-Harmony-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Triangle104/Mistral-Small-24b-Harmony
2ef93029bdda2058b5ade068e20a76028bd0b87d
27.168346
apache-2.0
0
23.572
true
false
false
false
1.439958
0.168712
16.871235
0.643373
48.421151
0.191088
19.108761
0.384228
17.897092
0.427604
11.483854
0.543052
49.227985
true
false
2025-02-20
2025-02-20
1
Triangle104/Mistral-Small-24b-Harmony (Merge)