Dataset schema (column, dtype, observed range or number of distinct values):

| Column | dtype | Range / values |
| --- | --- | --- |
| eval_name | stringlengths | 12–111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 7 values |
| T | stringclasses | 7 values |
| Weight type | stringclasses | 2 values |
| Architecture | stringclasses | 64 values |
| Model | stringlengths | 355–689 |
| fullname | stringlengths | 4–102 |
| Model sha | stringlengths | 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | stringclasses | 27 values |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 525 values |
| Submission Date | stringclasses | 263 values |
| Generation | int64 | 0–10 |
| Base Model | stringlengths | 4–102 |
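Judging from the paired Raw / score columns in the rows below, each benchmark's raw accuracy appears to be rescaled against a random-guessing baseline (0.0 for IFEval, 0.25 for GPQA), and the Average ⬆️ column appears to be the plain mean of the six rescaled scores. A minimal sketch, under those assumptions, checked against the Triangle104/Pans_Gutenbergum_V0.1 row:

```python
# Sketch (assumption inferred from the paired Raw/score columns): rescale a
# raw accuracy against a random-guessing baseline onto a 0-100 scale.
def normalize(raw: float, baseline: float = 0.0) -> float:
    """Map raw in [baseline, 1] onto 0-100, floored at 0."""
    return max(0.0, 100.0 * (raw - baseline) / (1.0 - baseline))

# Values copied from the Triangle104/Pans_Gutenbergum_V0.1 row:
print(normalize(0.309696))        # ~30.9696 (IFEval column: 30.969605)
print(normalize(0.322987, 0.25))  # ~9.7316  (GPQA column: 9.731544)

# "Average ⬆️" appears to be the mean of the six normalized scores:
scores = [30.969605, 36.082449, 10.574018, 9.731544, 16.334896, 29.964539]
print(sum(scores) / len(scores))  # ~22.276175 (Average ⬆️ column)
```

The GPQA baseline of 0.25 matches four-choice random guessing; the other benchmarks' baselines are not recoverable from this excerpt alone.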
| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Triangle104_Pans_Gutenbergum_V0.1_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [Triangle104/Pans_Gutenbergum_V0.1](https://huggingface.co/Triangle104/Pans_Gutenbergum_V0.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Pans_Gutenbergum_V0.1-details) | Triangle104/Pans_Gutenbergum_V0.1 | 87017852514d3f12ced5e44a670ee00e0fd00124 | 22.276175 | apache-2.0 | 3 | 12.248 | true | false | false | false | 1.684474 | 0.309696 | 30.969605 | 0.554109 | 36.082449 | 0.10574 | 10.574018 | 0.322987 | 9.731544 | 0.452813 | 16.334896 | 0.369681 | 29.964539 | true | false | 2024-10-27 | 2025-01-30 | 1 | Triangle104/Pans_Gutenbergum_V0.1 (Merge) |
| Triangle104_Pans_Gutenbergum_V0.2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [Triangle104/Pans_Gutenbergum_V0.2](https://huggingface.co/Triangle104/Pans_Gutenbergum_V0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Pans_Gutenbergum_V0.2-details) | Triangle104/Pans_Gutenbergum_V0.2 | d40fa1df61d99c02abc6d242d93be26a2a457aee | 21.715318 | | 1 | 12.248 | false | false | false | false | 1.681226 | 0.321511 | 32.151137 | 0.552579 | 35.914459 | 0.068731 | 6.873112 | 0.312081 | 8.277405 | 0.467323 | 18.348698 | 0.358544 | 28.727098 | false | false | 2024-12-14 | 2025-01-30 | 1 | Triangle104/Pans_Gutenbergum_V0.2 (Merge) |
| Triangle104_Pantheon_ChatWaifu_V0.2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [Triangle104/Pantheon_ChatWaifu_V0.2](https://huggingface.co/Triangle104/Pantheon_ChatWaifu_V0.2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Pantheon_ChatWaifu_V0.2-details) | Triangle104/Pantheon_ChatWaifu_V0.2 | 27714543a8cbeedc8dba50651f40b5170ae65000 | 20.845639 | | 1 | 12.248 | false | false | false | false | 1.774448 | 0.26828 | 26.828038 | 0.553157 | 36.743199 | 0.056647 | 5.664653 | 0.317953 | 9.060403 | 0.47551 | 19.638802 | 0.344249 | 27.138741 | false | false | 2024-10-27 | 2025-01-30 | 1 | Triangle104/Pantheon_ChatWaifu_V0.2 (Merge) |
| Triangle104_Phi-4-AbliteratedRP_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | [Triangle104/Phi-4-AbliteratedRP](https://huggingface.co/Triangle104/Phi-4-AbliteratedRP) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Phi-4-AbliteratedRP-details) | Triangle104/Phi-4-AbliteratedRP | 31fd2f1f6f0b95b5c695e9601344f24202e962c6 | 37.374976 | apache-2.0 | 3 | 14.66 | true | false | false | false | 1.739477 | 0.492271 | 49.227051 | 0.670878 | 52.640969 | 0.307402 | 30.740181 | 0.395134 | 19.35123 | 0.509833 | 24.429167 | 0.530751 | 47.861259 | true | false | 2025-01-10 | 2025-01-27 | 1 | Triangle104/Phi-4-AbliteratedRP (Merge) |
| Triangle104_Phi4-RP-o1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [Triangle104/Phi4-RP-o1](https://huggingface.co/Triangle104/Phi4-RP-o1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Phi4-RP-o1-details) | Triangle104/Phi4-RP-o1 | 303b720420c30329ced0aca2a217cca97da405d1 | 28.809089 | mit | 0 | 14.66 | true | false | false | false | 1.863965 | 0.022007 | 2.200716 | 0.665256 | 51.593919 | 0.377644 | 37.76435 | 0.373322 | 16.442953 | 0.475573 | 19.179948 | 0.511054 | 45.672651 | true | false | 2025-01-30 | 2025-01-30 | 1 | Triangle104/Phi4-RP-o1 (Merge) |
| Triangle104_Phi4-RP-o1-Ablit_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [Triangle104/Phi4-RP-o1-Ablit](https://huggingface.co/Triangle104/Phi4-RP-o1-Ablit) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Phi4-RP-o1-Ablit-details) | Triangle104/Phi4-RP-o1-Ablit | 28b977c7d255b9c504a63d7120b43c5638ac512a | 28.677486 | | 0 | 14.66 | false | false | false | false | 0.910974 | 0.023856 | 2.385559 | 0.662983 | 51.221841 | 0.388218 | 38.821752 | 0.363255 | 15.100671 | 0.475417 | 18.927083 | 0.510472 | 45.608008 | false | false | 2025-02-08 | 2025-02-09 | 1 | Triangle104/Phi4-RP-o1-Ablit (Merge) |
| Triangle104_Porpoise-R1-Llama3.2-3b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [Triangle104/Porpoise-R1-Llama3.2-3b](https://huggingface.co/Triangle104/Porpoise-R1-Llama3.2-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Porpoise-R1-Llama3.2-3b-details) | Triangle104/Porpoise-R1-Llama3.2-3b | dedf20a6549763640bc4c35cba354351cb6cce8c | 13.626836 | llama3.2 | 2 | 3.213 | true | false | false | false | 0.594629 | 0.435217 | 43.521745 | 0.382368 | 12.926571 | 0.042296 | 4.229607 | 0.266779 | 2.237136 | 0.357625 | 6.436458 | 0.211686 | 12.409501 | true | false | 2025-02-07 | 2025-02-08 | 1 | Triangle104/Porpoise-R1-Llama3.2-3b (Merge) |
| Triangle104_Q2.5-14B-Instruct-1M-Harmony_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-14B-Instruct-1M-Harmony](https://huggingface.co/Triangle104/Q2.5-14B-Instruct-1M-Harmony) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-14B-Instruct-1M-Harmony-details) | Triangle104/Q2.5-14B-Instruct-1M-Harmony | 82520f6d9a6a8f3d788a98e7009c22398985675d | 37.738269 | apache-2.0 | 1 | 14.77 | true | false | false | false | 3.773034 | 0.598633 | 59.863274 | 0.633881 | 47.259249 | 0.376888 | 37.688822 | 0.375 | 16.666667 | 0.479542 | 19.676042 | 0.50748 | 45.275561 | true | false | 2025-02-02 | 2025-02-02 | 1 | Triangle104/Q2.5-14B-Instruct-1M-Harmony (Merge) |
| Triangle104_Q2.5-AthensCOT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-AthensCOT](https://huggingface.co/Triangle104/Q2.5-AthensCOT) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-AthensCOT-details) | Triangle104/Q2.5-AthensCOT | 6eaa16d09a4971ed9ad4371aada365caef30d38d | 28.55805 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.27705 | 0.457274 | 45.727448 | 0.554169 | 36.446692 | 0.291541 | 29.154079 | 0.300336 | 6.711409 | 0.457833 | 15.7625 | 0.437916 | 37.546173 | true | false | 2025-01-26 | 2025-01-27 | 1 | Triangle104/Q2.5-AthensCOT (Merge) |
| Triangle104_Q2.5-CodeR1-3B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-CodeR1-3B](https://huggingface.co/Triangle104/Q2.5-CodeR1-3B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-CodeR1-3B-details) | Triangle104/Q2.5-CodeR1-3B | 4ef3c55b383e21eedbe5e7094d12158a47c7f88e | 19.810785 | apache-2.0 | 1 | 3.085 | true | false | false | false | 0.732214 | 0.358756 | 35.875588 | 0.466084 | 25.312035 | 0.163897 | 16.389728 | 0.303691 | 7.158837 | 0.431542 | 12.142708 | 0.297872 | 21.985816 | true | false | 2025-02-24 | 2025-02-24 | 1 | Triangle104/Q2.5-CodeR1-3B (Merge) |
| Triangle104_Q2.5-EVACOT-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-EVACOT-7b](https://huggingface.co/Triangle104/Q2.5-EVACOT-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-EVACOT-7b-details) | Triangle104/Q2.5-EVACOT-7b | ed18ec729937b992ee9c9e0992d87d137b604f9c | 30.447241 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.309828 | 0.578424 | 57.842414 | 0.550552 | 35.722593 | 0.282477 | 28.247734 | 0.317953 | 9.060403 | 0.449865 | 14.79974 | 0.433095 | 37.010564 | true | false | 2025-01-26 | 2025-01-27 | 1 | Triangle104/Q2.5-EVACOT-7b (Merge) |
| Triangle104_Q2.5-EvaHumane-RP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-EvaHumane-RP](https://huggingface.co/Triangle104/Q2.5-EvaHumane-RP) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-EvaHumane-RP-details) | Triangle104/Q2.5-EvaHumane-RP | e303ea41606c02d5f434e88bddde43ba96586891 | 26.370823 | | 2 | 7.616 | false | false | false | false | 1.257423 | 0.367623 | 36.762346 | 0.53282 | 33.757406 | 0.292296 | 29.229607 | 0.318792 | 9.17226 | 0.427635 | 11.38776 | 0.44124 | 37.915559 | false | false | 2025-01-18 | 2025-01-27 | 1 | Triangle104/Q2.5-EvaHumane-RP (Merge) |
| Triangle104_Q2.5-Humane-RP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-Humane-RP](https://huggingface.co/Triangle104/Q2.5-Humane-RP) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-Humane-RP-details) | Triangle104/Q2.5-Humane-RP | 819395b7cc4fc157d8f322432580a5c3defebd63 | 29.83188 | apache-2.0 | 2 | 7.616 | true | false | false | false | 1.254657 | 0.441163 | 44.116278 | 0.564929 | 37.653375 | 0.339124 | 33.912387 | 0.318792 | 9.17226 | 0.452813 | 15.334896 | 0.449219 | 38.802083 | true | false | 2025-01-12 | 2025-01-27 | 1 | Triangle104/Q2.5-Humane-RP (Merge) |
| Triangle104_Q2.5-Instruct-1M_Harmony_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-Instruct-1M_Harmony](https://huggingface.co/Triangle104/Q2.5-Instruct-1M_Harmony) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-Instruct-1M_Harmony-details) | Triangle104/Q2.5-Instruct-1M_Harmony | efda8174f8b2134ca9ee1490f5bb73917845e24c | 32.078676 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.275663 | 0.603803 | 60.380346 | 0.537324 | 33.631462 | 0.332326 | 33.232628 | 0.322987 | 9.731544 | 0.468781 | 18.097656 | 0.436586 | 37.398419 | true | false | 2025-01-29 | 2025-01-29 | 1 | Triangle104/Q2.5-Instruct-1M_Harmony (Merge) |
| Triangle104_Q2.5-R1-3B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-R1-3B](https://huggingface.co/Triangle104/Q2.5-R1-3B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-R1-3B-details) | Triangle104/Q2.5-R1-3B | cd4b8cb325e4e126f62e1279fd2281167b9228ef | 24.66767 | apache-2.0 | 0 | 3.085 | true | false | false | false | 0.787777 | 0.421354 | 42.135423 | 0.481243 | 27.203483 | 0.267372 | 26.73716 | 0.309564 | 7.941834 | 0.431979 | 12.730729 | 0.381316 | 31.257388 | true | false | 2025-02-24 | 2025-02-24 | 1 | Triangle104/Q2.5-R1-3B (Merge) |
| Triangle104_Q2.5-R1-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Q2.5-R1-7B](https://huggingface.co/Triangle104/Q2.5-R1-7B) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Q2.5-R1-7B-details) | Triangle104/Q2.5-R1-7B | 2579a9c6b7c04b92d07b00d467eb56708e971a98 | 3.783468 | apache-2.0 | 1 | 7.613 | true | false | false | false | 0.700542 | 0.134615 | 13.461504 | 0.300656 | 2.548887 | 0.016616 | 1.661631 | 0.252517 | 0.33557 | 0.360729 | 2.691146 | 0.118019 | 2.002069 | true | false | 2025-02-24 | 2025-02-27 | 1 | Triangle104/Q2.5-R1-7B (Merge) |
| Triangle104_Robo-Gutenberg_V1.0_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Robo-Gutenberg_V1.0](https://huggingface.co/Triangle104/Robo-Gutenberg_V1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Robo-Gutenberg_V1.0-details) | Triangle104/Robo-Gutenberg_V1.0 | 9162806dd52c7cf1b7bb9798cc4176460859ce88 | 40.348594 | apache-2.0 | 1 | 14.77 | true | false | false | false | 3.889838 | 0.600756 | 60.075599 | 0.653717 | 50.286291 | 0.456193 | 45.619335 | 0.385906 | 18.120805 | 0.474365 | 19.195573 | 0.539146 | 48.793957 | true | false | 2024-12-05 | 2025-01-30 | 1 | Triangle104/Robo-Gutenberg_V1.0 (Merge) |
| Triangle104_Rocinante-Prism_V2.0_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [Triangle104/Rocinante-Prism_V2.0](https://huggingface.co/Triangle104/Rocinante-Prism_V2.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Rocinante-Prism_V2.0-details) | Triangle104/Rocinante-Prism_V2.0 | 1d6764b9feeba5794bfa1f89ab931648ad9dc3fa | 20.774919 | | 1 | 12.248 | false | false | false | false | 1.649364 | 0.26161 | 26.161031 | 0.536125 | 33.228206 | 0.111027 | 11.102719 | 0.32047 | 9.395973 | 0.445 | 15.425 | 0.364029 | 29.336584 | false | false | 2024-12-02 | 2025-01-30 | 1 | Triangle104/Rocinante-Prism_V2.0 (Merge) |
| Triangle104_Rocinante-Prism_V2.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [Triangle104/Rocinante-Prism_V2.1](https://huggingface.co/Triangle104/Rocinante-Prism_V2.1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Rocinante-Prism_V2.1-details) | Triangle104/Rocinante-Prism_V2.1 | 775836a90e03521f2f0a8850ba4862812678299f | 20.791469 | | 0 | 12.248 | false | false | false | false | 1.70315 | 0.25584 | 25.584006 | 0.533268 | 32.982523 | 0.112538 | 11.253776 | 0.319631 | 9.284116 | 0.448969 | 16.18776 | 0.36511 | 29.456634 | false | false | | 2025-01-30 | 0 | Removed |
| Triangle104_RomboHermes3-R1-Llama3.2-3b_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [Triangle104/RomboHermes3-R1-Llama3.2-3b](https://huggingface.co/Triangle104/RomboHermes3-R1-Llama3.2-3b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__RomboHermes3-R1-Llama3.2-3b-details) | Triangle104/RomboHermes3-R1-Llama3.2-3b | 0725171bdc64ac36851ae22eb6e5e8242fb9b8c6 | 14.658281 | llama3.2 | 2 | 3.213 | true | false | false | false | 0.595918 | 0.300729 | 30.072873 | 0.426395 | 19.092696 | 0.081571 | 8.1571 | 0.283557 | 4.474273 | 0.365656 | 4.407031 | 0.295711 | 21.745715 | true | false | 2025-02-09 | 2025-02-09 | 1 | Triangle104/RomboHermes3-R1-Llama3.2-3b (Merge) |
| Triangle104_Rombos-Novasky-7B_V1c_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | [Triangle104/Rombos-Novasky-7B_V1c](https://huggingface.co/Triangle104/Rombos-Novasky-7B_V1c) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Rombos-Novasky-7B_V1c-details) | Triangle104/Rombos-Novasky-7B_V1c | 77fb1ba91d47b6caf9bbfb9c6f8d7cb591889884 | 18.209818 | apache-2.0 | 0 | 7.616 | true | false | false | false | 0.672077 | 0.408015 | 40.801518 | 0.434925 | 20.422125 | 0.085347 | 8.534743 | 0.296141 | 6.152125 | 0.446458 | 14.040625 | 0.27377 | 19.307772 | true | false | 2025-02-18 | 2025-02-20 | 1 | Triangle104/Rombos-Novasky-7B_V1c (Merge) |
| Triangle104_Set-70b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [Triangle104/Set-70b](https://huggingface.co/Triangle104/Set-70b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Triangle104__Set-70b-details) | Triangle104/Set-70b | 206b32829e4fcb65306bbb60370735f97cef6e4d | 44.034692 | llama3.3 | 0 | 70.554 | true | false | false | false | 28.620896 | 0.764295 | 76.42954 | 0.701429 | 56.880031 | 0.364048 | 36.404834 | 0.446309 | 26.174497 | 0.469563 | 18.961979 | 0.544215 | 49.35727 | true | false | 2025-01-19 | 2025-01-27 | 1 | Triangle104/Set-70b (Merge) |
| Tsunami-th_Tsunami-0.5-7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [Tsunami-th/Tsunami-0.5-7B-Instruct](https://huggingface.co/Tsunami-th/Tsunami-0.5-7B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-0.5-7B-Instruct-details) | Tsunami-th/Tsunami-0.5-7B-Instruct | 10706336513d54c4e8962f54653f25941c4031f4 | 36.427097 | apache-2.0 | 0 | 7.616 | true | false | false | true | 2.180106 | 0.740015 | 74.001538 | 0.552369 | 36.138254 | 0.504532 | 50.453172 | 0.308725 | 7.829978 | 0.425719 | 12.214844 | 0.441323 | 37.924793 | false | false | 2024-10-11 | 2024-10-12 | 1 | Tsunami-th/Tsunami-0.5-7B-Instruct (Merge) |
| Tsunami-th_Tsunami-0.5x-7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [Tsunami-th/Tsunami-0.5x-7B-Instruct](https://huggingface.co/Tsunami-th/Tsunami-0.5x-7B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-0.5x-7B-Instruct-details) | Tsunami-th/Tsunami-0.5x-7B-Instruct | 83d048ab565893a660fa7eaeb4a749d360c76b53 | 36.004747 | apache-2.0 | 1 | 7.616 | true | false | false | true | 2.117127 | 0.709915 | 70.991525 | 0.559287 | 37.363061 | 0.420695 | 42.069486 | 0.314597 | 8.612975 | 0.466677 | 18.567969 | 0.445811 | 38.423463 | false | false | 2024-10-15 | 2024-10-16 | 1 | Tsunami-th/Tsunami-0.5x-7B-Instruct (Merge) |
| Tsunami-th_Tsunami-1.0-14B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [Tsunami-th/Tsunami-1.0-14B-Instruct](https://huggingface.co/Tsunami-th/Tsunami-1.0-14B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-1.0-14B-Instruct-details) | Tsunami-th/Tsunami-1.0-14B-Instruct | b468814b5242acbe6294226db71bc19dead6c8b6 | 41.840045 | apache-2.0 | 0 | 14.77 | true | false | false | true | 3.307263 | 0.782905 | 78.290491 | 0.643876 | 49.150255 | 0.458459 | 45.845921 | 0.356544 | 14.205817 | 0.445938 | 16.342187 | 0.52485 | 47.2056 | false | false | 2024-10-25 | 2024-10-25 | 1 | Tsunami-th/Tsunami-1.0-14B-Instruct (Merge) |
| Tsunami-th_Tsunami-1.0-7B-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | [Tsunami-th/Tsunami-1.0-7B-Instruct](https://huggingface.co/Tsunami-th/Tsunami-1.0-7B-Instruct) [📑](https://huggingface.co/datasets/open-llm-leaderboard/Tsunami-th__Tsunami-1.0-7B-Instruct-details) | Tsunami-th/Tsunami-1.0-7B-Instruct | 34d0f8da8ce6b0de50a269eef622ff2e93e5c059 | 35.748713 | apache-2.0 | 1 | 7.616 | true | false | false | true | 2.968437 | 0.730873 | 73.087297 | 0.549071 | 35.857243 | 0.433535 | 43.353474 | 0.312919 | 8.389262 | 0.449281 | 15.760156 | 0.442404 | 38.044843 | false | false | 2024-10-28 | 2024-10-28 | 1 | Tsunami-th/Tsunami-1.0-7B-Instruct (Merge) |
| UCLA-AGI_Gemma-2-9B-It-SPPO-Iter1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1](https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Gemma-2-9B-It-SPPO-Iter1-details) | UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1 | 33cfd6919f22efc38f71e9d21a7e697afb418e6b | 22.586155 | gemma | 3 | 9.242 | true | false | false | true | 5.885316 | 0.308221 | 30.822108 | 0.596893 | 41.80923 | 0.089879 | 8.987915 | 0.336409 | 11.521253 | 0.409938 | 10.075521 | 0.390708 | 32.300901 | false | false | 2024-06-29 | 2024-09-21 | 0 | UCLA-AGI/Gemma-2-9B-It-SPPO-Iter1 |
| UCLA-AGI_Gemma-2-9B-It-SPPO-Iter2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | [UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2](https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Gemma-2-9B-It-SPPO-Iter2-details) | UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2 | b7590721d92bf6e0606e3dbc1ca2c229b7c534b4 | 22.563073 | gemma | 3 | 9.242 | true | false | false | true | 5.432924 | 0.31002 | 31.001964 | 0.598988 | 42.169834 | 0.080816 | 8.081571 | 0.334732 | 11.297539 | 0.413938 | 10.942188 | 0.386968 | 31.885343 | false | false | 2024-06-29 | 2024-08-07 | 0 | UCLA-AGI/Gemma-2-9B-It-SPPO-Iter2 |
| UCLA-AGI_Gemma-2-9B-It-SPPO-Iter3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | [UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3](https://huggingface.co/UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Gemma-2-9B-It-SPPO-Iter3-details) | UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3 | 2261f2a03b2e15de13a18da52590c237ecf5f188 | 22.650463 | gemma | 122 | 9.242 | true | false | false | true | 5.6303 | 0.316714 | 31.67141 | 0.600708 | 42.536752 | 0.070997 | 7.099698 | 0.338926 | 11.856823 | 0.416604 | 11.342188 | 0.382563 | 31.395907 | false | false | 2024-06-29 | 2024-07-31 | 0 | UCLA-AGI/Gemma-2-9B-It-SPPO-Iter3 |
| UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1](https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter1-details) | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1 | 2076437f65776aeb9686c95f1f41515f70c4db27 | 24.765958 | apache-2.0 | 1 | 8.03 | true | false | false | true | 1.402458 | 0.729899 | 72.989889 | 0.505789 | 29.489353 | 0.114804 | 11.480363 | 0.267617 | 2.348993 | 0.356792 | 2.165625 | 0.371094 | 30.121528 | false | false | 2024-06-25 | 2024-09-21 | 0 | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter1 |
| UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2](https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter2-details) | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2 | 730c7207d4b538feeb3c2e6d6f6a6ba8615a9be3 | 24.040943 | apache-2.0 | 0 | 8 | true | false | false | true | 1.313533 | 0.698875 | 69.887454 | 0.50887 | 29.869449 | 0.103474 | 10.347432 | 0.266779 | 2.237136 | 0.359427 | 1.995052 | 0.369182 | 29.909131 | false | false | 2024-06-25 | 2024-08-07 | 0 | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter2 |
| UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3](https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter3-details) | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3 | f73dafc2923acd56f115f21f76e9d14f8d19a63e | 23.693396 | apache-2.0 | 82 | 8.03 | true | false | false | true | 9.135299 | 0.683412 | 68.341224 | 0.507958 | 29.739684 | 0.095921 | 9.592145 | 0.265101 | 2.013423 | 0.366062 | 3.091146 | 0.364445 | 29.382757 | false | false | 2024-06-25 | 2024-07-02 | 0 | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3 |
| UCLA-AGI_Llama-3-Instruct-8B-SPPO-Iter3_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3](https://huggingface.co/UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Llama-3-Instruct-8B-SPPO-Iter3-details) | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3 | f73dafc2923acd56f115f21f76e9d14f8d19a63e | 23.05947 | apache-2.0 | 82 | 8.03 | true | false | false | true | 0.910475 | 0.670298 | 67.029814 | 0.507641 | 29.716701 | 0.071752 | 7.175227 | 0.265101 | 2.013423 | 0.364729 | 2.891146 | 0.365775 | 29.530511 | false | false | 2024-06-25 | 2024-06-28 | 0 | UCLA-AGI/Llama-3-Instruct-8B-SPPO-Iter3 |
| UCLA-AGI_Mistral7B-PairRM-SPPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [UCLA-AGI/Mistral7B-PairRM-SPPO](https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-details) | UCLA-AGI/Mistral7B-PairRM-SPPO | abdc173603690fcf6b333b351c291a321d2631c3 | 16.444697 | apache-2.0 | 6 | 7.242 | true | false | false | true | 1.008318 | 0.435492 | 43.549227 | 0.443898 | 22.084656 | 0.030967 | 3.096677 | 0.28104 | 4.138702 | 0.396479 | 7.793229 | 0.262051 | 18.005689 | false | false | 2024-05-04 | 2024-09-21 | 0 | UCLA-AGI/Mistral7B-PairRM-SPPO |
| UCLA-AGI_Mistral7B-PairRM-SPPO-Iter1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1](https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-Iter1-details) | UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1 | 97252e2d868725b2fa5055adc241c5182610fb6a | 17.917746 | apache-2.0 | 2 | 7.242 | true | false | false | true | 1.053347 | 0.504735 | 50.473521 | 0.446806 | 22.932292 | 0.024924 | 2.492447 | 0.283557 | 4.474273 | 0.399177 | 8.297135 | 0.269531 | 18.836806 | false | false | 2024-05-04 | 2024-09-21 | 0 | UCLA-AGI/Mistral7B-PairRM-SPPO-Iter1 |
| UCLA-AGI_Mistral7B-PairRM-SPPO-Iter2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2](https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-Iter2-details) | UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2 | 8201064df67b5762ff9f361ff1b98aae3747855c | 17.11814 | apache-2.0 | 1 | 7.242 | true | false | false | true | 1.030969 | 0.444585 | 44.458481 | 0.446572 | 22.479924 | 0.021903 | 2.190332 | 0.288591 | 5.145414 | 0.408542 | 9.801042 | 0.267703 | 18.633644 | false | false | 2024-05-04 | 2024-08-07 | 0 | UCLA-AGI/Mistral7B-PairRM-SPPO-Iter2 |
| UCLA-AGI_Mistral7B-PairRM-SPPO-Iter3_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3](https://huggingface.co/UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UCLA-AGI__Mistral7B-PairRM-SPPO-Iter3-details) | UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3 | 72cd8e5435ae679249ddad7ac4cdb64c5b4590c3 | 16.488657 | apache-2.0 | 5 | 7.242 | true | false | false | true | 1.036815 | 0.435068 | 43.506784 | 0.439659 | 21.817496 | 0.023414 | 2.34139 | 0.275168 | 3.355705 | 0.407115 | 9.489323 | 0.265791 | 18.421247 | false | false | 2024-05-04 | 2024-08-07 | 0 | UCLA-AGI/Mistral7B-PairRM-SPPO-Iter3 |
| UKzExecution_LlamaExecutor-8B-3.0.5_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | [UKzExecution/LlamaExecutor-8B-3.0.5](https://huggingface.co/UKzExecution/LlamaExecutor-8B-3.0.5) [📑](https://huggingface.co/datasets/open-llm-leaderboard/UKzExecution__LlamaExecutor-8B-3.0.5-details) | UKzExecution/LlamaExecutor-8B-3.0.5 | 2047978e8ab1146b8881cde3d998856594f437a4 | 24.541079 | | 0 | 8.03 | false | false | false | true | 1.614004 | 0.74029 | 74.029021 | 0.5006 | 28.413815 | 0.101964 | 10.196375 | 0.255872 | 0.782998 | 0.375365 | 4.653906 | 0.362533 | 29.170361 | false | false | 2024-07-29 | 2024-07-30 | 1 | UKzExecution/LlamaExecutor-8B-3.0.5 (Merge) |
Unbabel_TowerInstruct-Mistral-7B-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Unbabel/TowerInstruct-Mistral-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Unbabel/TowerInstruct-Mistral-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Unbabel__TowerInstruct-Mistral-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Unbabel/TowerInstruct-Mistral-7B-v0.2
454bdfedc8b51f292a402aba2c560df145a0817d
11.902717
cc-by-nc-4.0
16
7.242
true
false
false
false
1.207779
0.284342
28.434221
0.388195
14.224326
0.020393
2.039275
0.247483
0
0.452229
15.961979
0.196809
10.756501
false
false
2024-03-26
2024-09-06
0
Unbabel/TowerInstruct-Mistral-7B-v0.2
Undi95_MG-FinalMix-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Undi95/MG-FinalMix-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Undi95/MG-FinalMix-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Undi95__MG-FinalMix-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Undi95/MG-FinalMix-72B
6c9c2f5d052495dcd49f44bf5623d21210653c65
44.297362
other
5
72.706
true
false
false
true
24.444047
0.801365
80.136482
0.697302
57.502412
0.397281
39.728097
0.385067
18.008949
0.482271
21.217187
0.542719
49.191046
true
false
2024-06-25
2024-07-13
1
Undi95/MG-FinalMix-72B (Merge)
Undi95_Phi4-abliterated_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Undi95/Phi4-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Undi95/Phi4-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Undi95__Phi4-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Undi95/Phi4-abliterated
b960b130911eeb32fd728043652b9f9591821469
37.422371
11
14.66
true
false
false
true
1.910049
0.661755
66.175525
0.680902
54.117248
0.370091
37.009063
0.330537
10.738255
0.403427
8.928385
0.528092
47.565751
false
false
2025-01-09
2025-01-23
0
Undi95/Phi4-abliterated
V3N0M_Jenna-Tiny-2.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/V3N0M/Jenna-Tiny-2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">V3N0M/Jenna-Tiny-2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/V3N0M__Jenna-Tiny-2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
V3N0M/Jenna-Tiny-2.0
95c5e775b4a155110a5fac3e1cdd814dde93f220
5.519126
0
0.631
false
false
false
false
0.4936
0.230936
23.093614
0.314793
4.83
0.012085
1.208459
0.25
0
0.336667
2.35
0.114694
1.632683
false
false
2024-06-18
2025-01-17
0
V3N0M/Jenna-Tiny-2.0
VAGOsolutions_Llama-3-SauerkrautLM-70b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3-SauerkrautLM-70b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct
707cfd1a93875247c0223e0c7e3d86d58c432318
38.005588
other
23
70.554
true
false
false
true
21.252382
0.804462
80.446216
0.666325
52.02958
0.228097
22.809668
0.32802
10.402685
0.433938
13.542188
0.539229
48.803191
false
false
2024-04-24
2024-06-26
0
VAGOsolutions/Llama-3-SauerkrautLM-70b-Instruct
VAGOsolutions_Llama-3-SauerkrautLM-8b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3-SauerkrautLM-8b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
37127c44d7c0fb56cef817270c4b1a6802d8793a
26.667655
other
53
8.03
true
false
false
true
1.591387
0.744537
74.453672
0.494338
28.049242
0.066465
6.646526
0.308725
7.829978
0.424104
11.279688
0.385721
31.746823
false
false
2024-04-19
2024-07-22
0
VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct
VAGOsolutions_Llama-3.1-SauerkrautLM-70b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3.1-SauerkrautLM-70b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct
e8e74aa789243c25a3a8f7565780a402f5050bbb
43.41377
llama3.1
21
70.554
true
false
false
true
30.183235
0.865637
86.563651
0.700625
57.241621
0.369335
36.933535
0.341443
12.192394
0.471083
19.385417
0.533494
48.166002
false
false
2024-07-29
2024-08-26
0
VAGOsolutions/Llama-3.1-SauerkrautLM-70b-Instruct
VAGOsolutions_Llama-3.1-SauerkrautLM-8b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__Llama-3.1-SauerkrautLM-8b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct
23ca79966a4ab0a61f7ccc7a0454ffef553b66eb
29.931073
llama3.1
32
8.03
true
false
false
true
2.498872
0.801739
80.173938
0.511493
30.999361
0.194109
19.410876
0.290268
5.369128
0.414802
11.516927
0.389046
32.116209
false
false
2024-07-25
2024-07-29
0
VAGOsolutions/Llama-3.1-SauerkrautLM-8b-Instruct
VAGOsolutions_SauerkrautLM-1.5b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-1.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-1.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-1.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-1.5b
8f5170f03e6b0355dd920adc3a7e65d0417ee14e
10.273563
apache-2.0
11
1.544
true
false
false
true
1.552474
0.240403
24.040324
0.370391
13.419518
0.036254
3.625378
0.270973
2.796421
0.373906
4.971615
0.215093
12.788121
false
false
2024-06-12
2024-06-26
0
VAGOsolutions/SauerkrautLM-1.5b
VAGOsolutions_SauerkrautLM-7b-HerO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-HerO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-7b-HerO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-7b-HerO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-7b-HerO
3a14b437e2f375b74de3b6923e171662133347bb
19.669312
apache-2.0
32
7.242
true
false
false
true
1.139537
0.53461
53.461039
0.490443
27.991874
0.039275
3.927492
0.272651
3.020134
0.392385
6.88151
0.304604
22.733821
true
false
2023-11-24
2024-06-26
0
VAGOsolutions/SauerkrautLM-7b-HerO
VAGOsolutions_SauerkrautLM-7b-LaserChat_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-7b-LaserChat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-7b-LaserChat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-7b-LaserChat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-7b-LaserChat
cb759636a3d5b0768df2f43a3d3da9b17e10e7b9
22.147316
apache-2.0
12
7.242
true
false
false
true
1.209358
0.598782
59.878234
0.454327
22.99208
0.077795
7.779456
0.300336
6.711409
0.414802
9.916927
0.330452
25.605792
false
false
2024-02-05
2024-06-26
0
VAGOsolutions/SauerkrautLM-7b-LaserChat
VAGOsolutions_SauerkrautLM-Gemma-2b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Gemma-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Gemma-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Gemma-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Gemma-2b
f9d5575c23da96f33ce77dea3b0776746b9469bc
7.716095
other
8
2.506
true
false
false
true
1.832471
0.247522
24.752213
0.341632
9.13387
0.027946
2.794562
0.256711
0.894855
0.367583
3.514583
0.146858
5.206486
false
false
2024-03-06
2024-06-26
0
VAGOsolutions/SauerkrautLM-Gemma-2b
VAGOsolutions_SauerkrautLM-Gemma-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Gemma-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Gemma-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Gemma-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Gemma-7b
4296bdabf82e900235b094e5348be03ebb0ec891
14.801979
other
13
8.538
true
false
false
true
3.026499
0.340671
34.067053
0.418791
18.492652
0.067221
6.722054
0.286074
4.809843
0.359427
2.928385
0.296127
21.791888
false
false
2024-02-27
2024-06-26
0
VAGOsolutions/SauerkrautLM-Gemma-7b
VAGOsolutions_SauerkrautLM-Mixtral-8x7B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Mixtral-8x7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct
30ed549de7d84f68b4c6cb619f73275c99af23cc
24.487467
apache-2.0
22
46.703
true
true
false
true
7.54878
0.560189
56.018919
0.527734
33.945163
0.098187
9.818731
0.297819
6.375839
0.420417
11.31875
0.365027
29.4474
false
false
2023-12-15
2024-06-26
0
VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct
VAGOsolutions_SauerkrautLM-Nemo-12b-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Nemo-12b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct
fcb056465084ab2c71503a0760f46e4be79c985c
26.219082
apache-2.0
22
12.248
true
false
false
true
2.761055
0.611297
61.129691
0.521413
32.343783
0.122356
12.23565
0.309564
7.941834
0.446896
17.161979
0.338514
26.501551
false
false
2024-07-22
2024-07-22
0
VAGOsolutions/SauerkrautLM-Nemo-12b-Instruct
VAGOsolutions_SauerkrautLM-Phi-3-medium_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-Phi-3-medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-Phi-3-medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-Phi-3-medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-Phi-3-medium
ebfed26a2b35ede15fe526f57029e0ad866ac66d
30.407915
mit
9
13.96
true
false
false
false
1.565495
0.440888
44.088796
0.643293
49.63035
0.160121
16.012085
0.334732
11.297539
0.4845
20.695833
0.466506
40.722887
false
false
2024-06-09
2024-09-19
0
VAGOsolutions/SauerkrautLM-Phi-3-medium
VAGOsolutions_SauerkrautLM-SOLAR-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-SOLAR-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-SOLAR-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-SOLAR-Instruct
2665d7600ccd253728453433d2434844e6f702bd
21.221646
cc-by-nc-4.0
46
10.732
true
false
false
true
1.631461
0.491721
49.172086
0.516945
31.83892
0.063444
6.344411
0.305369
7.38255
0.396542
8.334375
0.318318
24.257535
false
false
2023-12-20
2024-06-26
0
VAGOsolutions/SauerkrautLM-SOLAR-Instruct
VAGOsolutions_SauerkrautLM-gemma-2-2b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-gemma-2-2b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-gemma-2-2b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-gemma-2-2b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-gemma-2-2b-it
7fd35fcb32aebfc422e535739161d7528fc562d5
10.817669
gemma
10
2.614
true
false
false
true
4.744973
0.132066
13.206625
0.424084
18.914195
0.021903
2.190332
0.272651
3.020134
0.399458
8.765625
0.269282
18.809102
false
false
2024-08-03
2024-08-26
0
VAGOsolutions/SauerkrautLM-gemma-2-2b-it
VAGOsolutions_SauerkrautLM-gemma-2-9b-it_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-gemma-2-9b-it" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-gemma-2-9b-it</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-gemma-2-9b-it-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-gemma-2-9b-it
8e02fc1c24e0499c74ee1186ddc46b989fe497f1
23.141814
gemma
7
9.242
true
false
false
true
5.812136
0.302401
30.240096
0.607265
43.249989
0.083837
8.383686
0.327181
10.290828
0.431823
12.344531
0.409076
34.341755
false
false
2024-08-12
2024-08-26
0
VAGOsolutions/SauerkrautLM-gemma-2-9b-it
VAGOsolutions_SauerkrautLM-v2-14b-DPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-v2-14b-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-v2-14b-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-v2-14b-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-v2-14b-DPO
1fbe5364bc443255a06df7fa0debbcc3d38ab866
37.583892
apache-2.0
19
14.77
true
false
false
true
2.983251
0.741165
74.116455
0.656037
50.926132
0.316465
31.646526
0.319631
9.284116
0.437469
13.783594
0.511719
45.746528
false
false
2024-10-31
2024-11-04
1
VAGOsolutions/SauerkrautLM-v2-14b-DPO (Merge)
VAGOsolutions_SauerkrautLM-v2-14b-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-v2-14b-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-v2-14b-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VAGOsolutions__SauerkrautLM-v2-14b-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-v2-14b-SFT
606ddc7819d4a5d9cd8618d5ede57e2bdd99a1ed
36.227856
apache-2.0
8
14.77
true
false
false
true
3.037849
0.694853
69.485299
0.621036
45.824351
0.32855
32.854985
0.33557
11.409396
0.417875
11.067708
0.520529
46.725399
false
false
2024-10-25
2024-11-04
1
VAGOsolutions/SauerkrautLM-v2-14b-SFT (Merge)
VIRNECT_llama-3-Korean-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VIRNECT/llama-3-Korean-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VIRNECT/llama-3-Korean-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VIRNECT__llama-3-Korean-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VIRNECT/llama-3-Korean-8B
c658409e094ff04eeb6ab6cee2d4bc56716e45f1
20.245301
llama3
0
8.03
true
false
false
true
0.812687
0.505835
50.583452
0.490825
27.322412
0.0929
9.29003
0.270973
2.796421
0.366156
3.269531
0.35389
28.209959
false
false
2024-07-17
2024-07-17
0
VIRNECT/llama-3-Korean-8B
VIRNECT_llama-3-Korean-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VIRNECT/llama-3-Korean-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VIRNECT/llama-3-Korean-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VIRNECT__llama-3-Korean-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VIRNECT/llama-3-Korean-8B
c658409e094ff04eeb6ab6cee2d4bc56716e45f1
20.431609
llama3
0
8.03
true
false
false
true
1.640613
0.502138
50.213766
0.491838
27.564319
0.108006
10.800604
0.270973
2.796421
0.364792
3.032292
0.35364
28.182255
false
false
2024-07-17
2024-07-17
0
VIRNECT/llama-3-Korean-8B
VIRNECT_llama-3-Korean-8B-r-v-0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/VIRNECT/llama-3-Korean-8B-r-v-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VIRNECT/llama-3-Korean-8B-r-v-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/VIRNECT__llama-3-Korean-8B-r-v-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VIRNECT/llama-3-Korean-8B-r-v-0.1
10acb1aa4f341f2d3c899d78c520b0822a909b95
18.749279
llama3
0
16.061
true
false
false
true
2.398982
0.491571
49.157125
0.480616
25.884954
0.086103
8.610272
0.24245
0
0.36749
3.736198
0.325964
25.107122
false
false
2024-07-18
2024-07-18
2
MLP-KTLim/llama-3-Korean-Bllossom-8B (Merge)
ValiantLabs_Llama3-70B-Fireplace_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3-70B-Fireplace" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3-70B-Fireplace</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3-70B-Fireplace-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3-70B-Fireplace
220079e4115733991eb19c30d5480db9696a665e
37.125227
llama3
3
70.554
true
false
false
true
19.384344
0.77736
77.735963
0.648899
49.55653
0.214502
21.450151
0.354866
13.982103
0.444854
16.773438
0.489279
43.253177
false
false
2024-05-09
2024-06-26
0
ValiantLabs/Llama3-70B-Fireplace
ValiantLabs_Llama3-70B-ShiningValiant2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3-70B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3-70B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3-70B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3-70B-ShiningValiant2
bd6cce8da08ccefe9ec58cae3df4bf75c97d8950
32.730483
llama3
5
70.554
true
false
false
true
22.435187
0.612171
61.217126
0.633834
46.710261
0.207704
20.770393
0.330537
10.738255
0.432573
13.638281
0.489777
43.308585
false
false
2024-04-20
2024-07-25
0
ValiantLabs/Llama3-70B-ShiningValiant2
ValiantLabs_Llama3.1-70B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-70B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-70B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-70B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-70B-ShiningValiant2
55436621ed65f0b79e7c6324b780bd6a18e06c79
36.493184
llama3.1
3
70.554
true
false
false
false
27.99457
0.535535
53.55346
0.673841
52.390969
0.291541
29.154079
0.392617
19.01566
0.468104
18.479688
0.517287
46.365248
false
false
2024-10-30
2024-10-30
2
meta-llama/Meta-Llama-3.1-70B
ValiantLabs_Llama3.1-8B-Cobalt_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Cobalt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Cobalt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Cobalt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Cobalt
3a69145a2acc1f7f51735aa3ae5d81c090249c65
20.239394
llama3.1
6
8.03
true
false
false
false
2.627763
0.349613
34.961347
0.494677
27.417777
0.126888
12.688822
0.303691
7.158837
0.395948
9.826823
0.364445
29.382757
false
false
2024-08-16
2024-10-02
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Cobalt_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Cobalt" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Cobalt</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Cobalt-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Cobalt
3a69145a2acc1f7f51735aa3ae5d81c090249c65
25.558664
llama3.1
6
8.03
true
false
false
true
0.938171
0.716835
71.683467
0.49107
27.235483
0.153323
15.332326
0.286074
4.809843
0.35124
4.704948
0.366273
29.585919
false
false
2024-08-16
2024-09-20
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Enigma_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Enigma" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Enigma</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Enigma-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Enigma
332c99d80f378c77b090745a5aac10f8ab339519
16.625157
llama3.1
10
8.03
true
false
false
false
7.275141
0.268055
26.805543
0.44776
22.012915
0.089124
8.912387
0.287752
5.033557
0.419604
10.217188
0.340924
26.769356
false
false
2024-08-11
2024-10-02
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Esper2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Esper2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Esper2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Esper2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Esper2
38f24f2fe90f839acbc57e7530221acf1232e9dc
13.94081
llama3.1
2
8.03
true
false
false
false
1.75353
0.25674
25.673989
0.446987
22.195685
0.058912
5.891239
0.272651
3.020134
0.356073
5.709115
0.290392
21.154699
false
false
2024-10-02
2024-10-09
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Fireplace2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Fireplace2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Fireplace2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Fireplace2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Fireplace2
be3a5c18b5e8e86a3703df1a8227f784ad2c713c
18.312602
llama3.1
6
8.03
true
false
false
true
0.916234
0.548324
54.8324
0.460982
24.070273
0.058157
5.81571
0.288591
5.145414
0.343302
4.379427
0.240691
15.632388
false
false
2024-07-23
2024-07-25
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-Fireplace2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-Fireplace2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-Fireplace2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-Fireplace2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-Fireplace2
ef129903bbdcc59efdbe10fe9061bff473334a99
18.570581
llama3.1
6
8.03
true
false
false
true
1.802195
0.532812
53.281183
0.461331
24.089954
0.087613
8.761329
0.28943
5.257271
0.336667
4.216667
0.242354
15.81708
false
false
2024-07-23
2024-08-10
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-ShiningValiant2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-ShiningValiant2
6b2b5694a192cb29ad0e4314138affa25b630c0e
23.157281
llama3.1
16
8.03
true
false
false
true
2.445238
0.649565
64.956538
0.477391
26.346119
0.056647
5.664653
0.310403
8.053691
0.390865
7.458073
0.338182
26.464613
false
false
2024-08-06
2024-08-10
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.1-8B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.1-8B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.1-8B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.1-8B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.1-8B-ShiningValiant2
6b2b5694a192cb29ad0e4314138affa25b630c0e
15.458036
llama3.1
16
8.03
true
false
false
false
6.528982
0.267806
26.780609
0.442929
21.61815
0.052115
5.21148
0.302013
6.935123
0.395917
10.789583
0.292719
21.413268
false
false
2024-08-06
2024-11-05
2
meta-llama/Meta-Llama-3.1-8B
ValiantLabs_Llama3.2-3B-Enigma_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-Enigma" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-Enigma</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-Enigma-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-Enigma
ca6adf3a289ce47c7598139e7a312e2b4b3708ce
11.692731
llama3.2
7
3.213
true
false
false
false
2.244793
0.278622
27.862183
0.372259
12.434026
0.043807
4.380665
0.261745
1.565996
0.392135
8.05026
0.242769
15.863254
false
false
2024-09-30
2024-10-02
1
meta-llama/Llama-3.2-3B-Instruct
ValiantLabs_Llama3.2-3B-Esper2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-Esper2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-Esper2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-Esper2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-Esper2
64a2c619a2e1680ab42945fcf5b75a5242cab3a1
10.944295
llama3.2
3
3.213
true
false
false
false
1.477769
0.274975
27.497484
0.380826
13.851733
0.036254
3.625378
0.270134
2.684564
0.354958
4.036458
0.225731
13.970154
false
false
2024-10-03
2024-10-09
1
meta-llama/Llama-3.2-3B-Instruct
ValiantLabs_Llama3.2-3B-ShiningValiant2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3.2-3B-ShiningValiant2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3.2-3B-ShiningValiant2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ValiantLabs__Llama3.2-3B-ShiningValiant2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3.2-3B-ShiningValiant2
1336e200485675c9b92baae17831eab17c601803
14.390696
llama3.2
3
3.213
true
false
false
false
3.463997
0.26251
26.251014
0.422593
18.912709
0.082326
8.232628
0.280201
4.026846
0.386646
8.597396
0.282912
20.323582
false
false
2024-09-27
2024-11-05
1
meta-llama/Llama-3.2-3B-Instruct
Vikhrmodels_Vikhr-Llama3.1-8B-Instruct-R-21-09-24_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Vikhrmodels__Vikhr-Llama3.1-8B-Instruct-R-21-09-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24
c0b57cf6d4444b35fc5cec0525ff5eef32af22c9
25.354951
apache-2.0
29
8.03
true
false
false
true
1.713223
0.643146
64.314574
0.527224
32.669417
0.217523
21.752266
0.244966
0
0.375396
5.091146
0.354721
28.302305
false
false
2024-09-20
2024-09-21
1
Vikhrmodels/Vikhr-Llama3.1-8B-Instruct-R-21-09-24 (Merge)
Vikhrmodels_Vikhr-Nemo-12B-Instruct-R-21-09-24_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Vikhrmodels__Vikhr-Nemo-12B-Instruct-R-21-09-24-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
6abd887cb631f705042c9e8085615fe4d76e9779
25.019954
apache-2.0
114
12.248
true
false
false
true
3.441584
0.599932
59.993152
0.521231
31.414409
0.17145
17.145015
0.291107
5.480984
0.407302
9.446094
0.339761
26.640071
false
false
2024-09-20
2024-09-21
1
Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24 (Merge)
Weyaxi_Bagel-Hermes-2x34B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Bagel-Hermes-2x34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Bagel-Hermes-2x34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Bagel-Hermes-2x34B
44fddd32d7dcafc0fa670fd87a2e129310640aac
25.611273
apache-2.0
16
60.814
true
true
false
true
19.630336
0.543153
54.315328
0.491666
27.409031
0.060423
6.042296
0.32802
10.402685
0.451667
15.625
0.45886
39.873301
false
false
2024-01-12
2024-10-28
0
Weyaxi/Bagel-Hermes-2x34B
Weyaxi_Bagel-Hermes-34B-Slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Bagel-Hermes-34B-Slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Bagel-Hermes-34B-Slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Bagel-Hermes-34B-Slerp
dcdcc17a2c650a95bc27129a3ddbf261dffed37f
27.246858
apache-2.0
1
34.389
true
false
false
false
6.042751
0.460272
46.027208
0.59219
41.957047
0.060423
6.042296
0.334732
11.297539
0.462208
17.009375
0.470329
41.14768
true
false
2024-01-12
2024-08-30
0
Weyaxi/Bagel-Hermes-34B-Slerp
Weyaxi_Einstein-v4-7B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v4-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v4-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v4-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v4-7B
7eecd9833b8a012e23ac1df789884888b047baa0
16.755664
other
48
7.242
true
false
false
true
1.335508
0.470813
47.0813
0.384947
14.304451
0.018882
1.888218
0.281879
4.250559
0.468167
19.020833
0.225898
13.988623
false
false
2024-02-22
2024-06-26
1
mistralai/Mistral-7B-v0.1
Weyaxi_Einstein-v6.1-Llama3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v6.1-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v6.1-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v6.1-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v6.1-Llama3-8B
5cab6d54666b6024d0f745d61abf1842edb934e0
20.169491
other
67
8.03
true
false
false
true
1.719196
0.456825
45.682456
0.50083
29.383773
0.067976
6.797583
0.281879
4.250559
0.421281
11.226823
0.313082
23.675754
false
false
2024-04-19
2024-06-26
1
meta-llama/Meta-Llama-3-8B
Weyaxi_Einstein-v6.1-developed-by-Weyaxi-Llama3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v6.1-developed-by-Weyaxi-Llama3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B
b7507e94146c0832c26609e9ab8115934d3e25b3
19.318743
other
1
8.03
true
false
false
true
1.743456
0.392702
39.270247
0.504384
29.694447
0.071752
7.175227
0.27349
3.131991
0.43325
13.389583
0.309259
23.25096
false
false
2024-06-23
2024-06-26
0
Weyaxi/Einstein-v6.1-developed-by-Weyaxi-Llama3-8B
Weyaxi_Einstein-v7-Qwen2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v7-Qwen2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v7-Qwen2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v7-Qwen2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v7-Qwen2-7B
d5a2f245bf98a40d196821bc378e10f35b4da81a
24.806418
other
38
7.616
true
false
false
true
2.627943
0.409963
40.996334
0.516147
32.841819
0.199396
19.939577
0.299497
6.599553
0.439979
14.064063
0.409574
34.397163
false
false
2024-06-24
2024-06-26
1
Qwen/Qwen2-7B
Weyaxi_Einstein-v8-Llama3.2-1B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/Einstein-v8-Llama3.2-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/Einstein-v8-Llama3.2-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__Einstein-v8-Llama3.2-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/Einstein-v8-Llama3.2-1B
1edc6abcb8eedd047bc40b79d2d36c3723ff28e2
4.640409
llama3.2
2
1.236
true
false
false
true
0.775849
0.186223
18.622256
0.301843
3.013774
0.000755
0.075529
0.258389
1.118568
0.361781
3.222656
0.116107
1.789672
false
false
2024-09-28
2024-09-30
1
meta-llama/Llama-3.2-1B
Weyaxi_SauerkrautLM-UNA-SOLAR-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
9678b9ca952abe0083dbfc772a56b849866bfa1a
20.476008
cc-by-nc-4.0
26
10.732
true
false
false
true
1.492022
0.457324
45.732434
0.516636
31.824687
0.046073
4.607251
0.311242
8.165548
0.397875
8.601042
0.315326
23.925089
true
false
2023-12-21
2024-06-26
0
Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct
WizardLMTeam_WizardLM-13B-V1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-13B-V1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-13B-V1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-13B-V1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
WizardLMTeam/WizardLM-13B-V1.0
964a93aa2e78da377115bb856075a69ebe8aefa4
4.546092
73
13
true
false
false
false
141.955174
0.185049
18.5049
0.291344
2.147967
0
0
0.259228
1.230425
0.349719
3.548177
0.116606
1.84508
false
true
2023-05-13
2024-06-13
0
WizardLMTeam/WizardLM-13B-V1.0
WizardLMTeam_WizardLM-13B-V1.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-13B-V1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-13B-V1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-13B-V1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
WizardLMTeam/WizardLM-13B-V1.2
cf5f40382559f19e13874e45b39575171ca46ef8
15.177533
llama2
226
13
true
false
false
false
7.038916
0.339247
33.924653
0.4462
22.888655
0.018882
1.888218
0.260906
1.454139
0.437844
14.030469
0.251912
16.879063
false
true
2023-07-25
2024-06-12
0
WizardLMTeam/WizardLM-13B-V1.2
WizardLMTeam_WizardLM-70B-V1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/WizardLMTeam/WizardLM-70B-V1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">WizardLMTeam/WizardLM-70B-V1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/WizardLMTeam__WizardLM-70B-V1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
WizardLMTeam/WizardLM-70B-V1.0
54aaecaff7d0790eb9f0ecea1cc267a94cc66949
22.397442
llama2
235
70
true
false
false
false
58.192127
0.495143
49.514289
0.559037
37.543355
0.039275
3.927492
0.26594
2.12528
0.439115
14.089323
0.344664
27.184914
false
true
2023-08-09
2024-06-12
0
WizardLMTeam/WizardLM-70B-V1.0
Wladastic_Mini-Think-Base-1B_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Wladastic/Mini-Think-Base-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Wladastic/Mini-Think-Base-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Wladastic__Mini-Think-Base-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Wladastic/Mini-Think-Base-1B
86460e314d6ca707921baae6864396c18e5c024a
14.34856
llama3.2
1
1.236
true
false
false
true
0.3615
0.558841
55.884054
0.357417
9.377988
0.073263
7.326284
0.263423
1.789709
0.32749
3.136198
0.177194
8.577128
false
false
2025-02-18
2025-02-22
1
Wladastic/Mini-Think-Base-1B (Merge)
Xclbr7_Arcanum-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Xclbr7/Arcanum-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/Arcanum-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__Arcanum-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xclbr7/Arcanum-12b
845ac67d2b527296ae8c06da4453bf8a60f2e59b
20.757226
mit
1
12.248
true
false
false
false
3.381021
0.290686
29.068649
0.526536
31.87996
0.119335
11.933535
0.32047
9.395973
0.417031
13.528906
0.358627
28.736333
false
false
2024-09-17
2024-09-17
0
Xclbr7/Arcanum-12b
Xclbr7_Hyena-12b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Xclbr7/Hyena-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/Hyena-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__Hyena-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xclbr7/Hyena-12b
9dd5eb77ce8e0e05e260ae4d812631fb980527fa
20.764534
apache-2.0
1
12.248
true
false
false
false
3.719787
0.340446
34.044557
0.545718
34.665649
0.113293
11.329305
0.297819
6.375839
0.398427
11.070052
0.343916
27.101803
false
false
2024-09-19
2024-09-19
1
Xclbr7/Arcanum-12b
Xclbr7_caliburn-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Xclbr7/caliburn-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/caliburn-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__caliburn-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xclbr7/caliburn-12b
f76fa67c7ca8bf7e75540baf55972ba52a46630b
22.946865
mit
0
12.248
true
false
false
false
3.712441
0.357631
35.763109
0.551863
35.636841
0.112538
11.253776
0.336409
11.521253
0.429188
13.781771
0.36752
29.724439
false
false
2024-09-14
2024-09-14
0
Xclbr7/caliburn-12b
Xclbr7_caliburn-v2-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Xclbr7/caliburn-v2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xclbr7/caliburn-v2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xclbr7__caliburn-v2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xclbr7/caliburn-v2-12b
fa736b3b852298dd8c047ac6dcc620161df4a79b
20.966113
mit
1
12.248
true
false
false
false
3.264387
0.296682
29.668169
0.514143
30.387967
0.104985
10.498489
0.326342
10.178971
0.437031
14.128906
0.378408
30.934176
false
false
2024-09-16
2024-09-16
0
Xclbr7/caliburn-v2-12b
Xiaojian9992024_Llama3.2-1B-THREADRIPPER_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Llama3.2-1B-THREADRIPPER" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Llama3.2-1B-THREADRIPPER</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Llama3.2-1B-THREADRIPPER-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Llama3.2-1B-THREADRIPPER
0697f71f487e1a845626f0cfce6df472fe5eb63d
14.088098
0
1.236
false
false
false
true
0.773647
0.557592
55.759163
0.354375
9.11553
0.074018
7.401813
0.260906
1.454139
0.312979
2.322396
0.17628
8.475547
false
false
2025-02-03
2025-02-03
1
Xiaojian9992024/Llama3.2-1B-THREADRIPPER (Merge)
Xiaojian9992024_Llama3.2-1B-THREADRIPPER-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Llama3.2-1B-THREADRIPPER-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Llama3.2-1B-THREADRIPPER-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Llama3.2-1B-THREADRIPPER-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Llama3.2-1B-THREADRIPPER-v0.2
fa2fbdb6ba0bfa29042d24e33ec10cb6561bf200
13.779235
0
1.236
false
false
false
true
0.687642
0.531788
53.178788
0.352782
8.329663
0.06571
6.570997
0.26594
2.12528
0.331646
4.189063
0.174535
8.281619
false
false
2025-02-03
2025-02-03
1
Xiaojian9992024/Llama3.2-1B-THREADRIPPER-v0.2 (Merge)
Xiaojian9992024_Phi-4-Megatron-Empathetic_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Phi-4-Megatron-Empathetic" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Phi-4-Megatron-Empathetic</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Phi-4-Megatron-Empathetic-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Phi-4-Megatron-Empathetic
68d1969e3a4a1b68546ba4c521c403b62d65c00a
27.967889
0
14.66
false
false
false
false
3.602238
0.017261
1.726087
0.66734
51.912764
0.269637
26.963746
0.385906
18.120805
0.507135
23.72526
0.508228
45.358673
false
false
2025-02-17
2025-02-17
1
Xiaojian9992024/Phi-4-Megatron-Empathetic (Merge)
Xiaojian9992024_Phi-4-mini-UNOFFICAL_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Phi-4-mini-UNOFFICAL" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Phi-4-mini-UNOFFICAL</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Phi-4-mini-UNOFFICAL-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Phi-4-mini-UNOFFICAL
39e3ee6d0335dc047f5e8901ea859b55bfce670e
3.014261
1
3.754
false
false
false
false
0.492023
0.127321
12.732106
0.294444
2.478947
0
0
0.240772
0
0.336823
1.269531
0.114445
1.604979
false
false
2025-01-25
2025-01-25
1
Xiaojian9992024/Phi-4-mini-UNOFFICAL (Merge)
Xiaojian9992024_Qwen2.5-7B-MS-Destroyer_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Qwen2.5-7B-MS-Destroyer" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Qwen2.5-7B-MS-Destroyer</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Qwen2.5-7B-MS-Destroyer-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Qwen2.5-7B-MS-Destroyer
dae469769a9e4d29a95d58bd18e9379cf96a6d61
35.43373
1
7.613
false
false
false
true
0.689325
0.729574
72.95742
0.54697
35.759655
0.459215
45.92145
0.30453
7.270694
0.427021
12.777604
0.44124
37.915559
false
false
2025-03-01
2025-03-03
1
Xiaojian9992024/Qwen2.5-7B-MS-Destroyer (Merge)
Xiaojian9992024_Qwen2.5-Dyanka-7B-Preview_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Xiaojian9992024__Qwen2.5-Dyanka-7B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview
7053bf7b54611e42080d873e0fe766cef471fa14
37.295944
apache-2.0
8
7.616
true
false
false
true
0.620859
0.764021
76.402058
0.554334
36.615172
0.487915
48.791541
0.317114
8.948546
0.448073
15.509115
0.437583
37.509235
true
false
2025-02-25
2025-02-25
1
Xiaojian9992024/Qwen2.5-Dyanka-7B-Preview (Merge)