| Column | Dtype | Range / cardinality |
|---|---|---|
| eval_name | stringlengths | 12–111 |
| Precision | stringclasses | 3 values |
| Type | stringclasses | 7 values |
| T | stringclasses | 7 values |
| Weight type | stringclasses | 2 values |
| Architecture | stringclasses | 64 values |
| Model | stringlengths | 355–689 |
| fullname | stringlengths | 4–102 |
| Model sha | stringlengths | 0–40 |
| Average ⬆️ | float64 | 0.74–52.1 |
| Hub License | stringclasses | 27 values |
| Hub ❤️ | int64 | 0–6.09k |
| #Params (B) | float64 | -1–141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04–187 |
| IFEval Raw | float64 | 0–0.9 |
| IFEval | float64 | 0–90 |
| BBH Raw | float64 | 0.22–0.83 |
| BBH | float64 | 0.25–76.7 |
| MATH Lvl 5 Raw | float64 | 0–0.71 |
| MATH Lvl 5 | float64 | 0–71.5 |
| GPQA Raw | float64 | 0.21–0.47 |
| GPQA | float64 | 0–29.4 |
| MUSR Raw | float64 | 0.29–0.6 |
| MUSR | float64 | 0–38.7 |
| MMLU-PRO Raw | float64 | 0.1–0.73 |
| MMLU-PRO | float64 | 0–70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | stringclasses | 525 values |
| Submission Date | stringclasses | 263 values |
| Generation | int64 | 0–10 |
| Base Model | stringlengths | 4–102 |
bunnycore_Phi-4-RP-v0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-RP-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-RP-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-RP-v0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-RP-v0
6ff65d49b76c23122a1d8767e17714db32c58760
38.211808
mit
3
14.66
true
false
false
true
1.846903
0.682713
68.271298
0.685634
54.844985
0.331571
33.1571
0.352349
13.646532
0.414094
10.861719
0.536403
48.489214
true
false
2025-01-10
2025-01-10
1
bunnycore/Phi-4-RP-v0 (Merge)
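The paired Raw / rescaled columns in each record appear to be related by a baseline normalization, and the Average ⬆️ column by the mean of the six rescaled scores. This is an inference checked against the record above (bunnycore/Phi-4-RP-v0), not something stated in the dump itself; the 0.25 and 0.10 baselines assume 4-way (GPQA) and 10-way (MMLU-PRO) multiple choice:

```python
def normalize(raw: float, baseline: float) -> float:
    """Rescale raw accuracy so random-guess -> 0 and perfect -> 100."""
    return 100.0 * (raw - baseline) / (1.0 - baseline)

# Values taken from the first record above (bunnycore/Phi-4-RP-v0).
gpqa = normalize(0.352349, baseline=0.25)      # GPQA Raw -> GPQA
mmlu_pro = normalize(0.536403, baseline=0.10)  # MMLU-PRO Raw -> MMLU-PRO

# Average ⬆️ is consistent with the mean of the six rescaled scores.
scores = [68.271298, 54.844985, 33.1571, 13.646532, 10.861719, 48.489214]
average = sum(scores) / len(scores)
```

BBH and MUSR cannot be reproduced this way from a single raw value, since their baselines vary per subtask; the uniform-baseline formula above only reconstructs the columns whose tasks share one random-guess rate.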
bunnycore_Phi-4-RR-Shoup_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-RR-Shoup" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-RR-Shoup</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-RR-Shoup-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-RR-Shoup
3aee2a7da66705498f159afdf1c077470a7beae7
41.279971
2
14.66
false
false
false
true
1.869728
0.658658
65.865792
0.694703
56.108394
0.499245
49.924471
0.337248
11.63311
0.444042
14.938542
0.542886
49.209515
false
false
2025-02-02
2025-02-02
1
bunnycore/Phi-4-RR-Shoup (Merge)
bunnycore_Phi-4-RStock-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-RStock-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-RStock-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-RStock-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-RStock-v0.1
47627dfbe666963edc311248551a029e81083dfa
41.102308
0
14.66
false
false
false
true
2.788148
0.701872
70.187214
0.692831
55.976292
0.395015
39.501511
0.364933
15.324385
0.458365
16.728906
0.54006
48.895538
false
false
2025-02-02
2025-02-02
1
bunnycore/Phi-4-RStock-v0.1 (Merge)
bunnycore_Phi-4-ReasoningRP_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-ReasoningRP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-ReasoningRP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-ReasoningRP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-ReasoningRP
5b3f6627c717f7bece020b4f17e52cc62e4bfbec
40.953871
mit
2
14.66
true
false
false
true
1.874709
0.67362
67.362044
0.692219
55.884464
0.456949
45.694864
0.34396
12.527964
0.449094
15.136719
0.542055
49.117169
true
false
2025-01-28
2025-01-28
1
bunnycore/Phi-4-ReasoningRP (Merge)
bunnycore_Phi-4-Sce-exp-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Sce-exp-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Sce-exp-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Sce-exp-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Sce-exp-v0.1
68f76d7bd85d2b2cfdffa6d2f5a53f0de623563a
41.332281
1
14.66
false
false
false
true
1.858318
0.659532
65.953226
0.694318
56.074962
0.503021
50.302115
0.33557
11.409396
0.444073
15.109115
0.542304
49.144873
false
false
2025-02-07
2025-02-07
1
bunnycore/Phi-4-Sce-exp-v0.1 (Merge)
bunnycore_Phi-4-Stock-Ex_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Stock-Ex" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Stock-Ex</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Stock-Ex-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Stock-Ex
26e75507bcc57a36a6e661fc46f431e9c6ed419d
40.217465
2
14.66
false
false
false
true
1.842296
0.657459
65.745888
0.686446
55.203551
0.40861
40.861027
0.350671
13.422819
0.462365
17.46224
0.537483
48.609264
false
false
2025-01-16
2025-01-16
1
bunnycore/Phi-4-Stock-Ex (Merge)
bunnycore_Phi-4-Stock-RP_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Stock-RP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Stock-RP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Stock-RP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Stock-RP
bf881c1b7f4e6f4b99f5c48fc1008b2cfe9efb64
39.044084
mit
1
14.66
true
false
false
true
2.311146
0.639923
63.992318
0.685963
55.20595
0.34139
34.138973
0.358221
14.42953
0.471479
18.534896
0.531666
47.96284
true
false
2025-01-12
2025-01-12
1
bunnycore/Phi-4-Stock-RP (Merge)
bunnycore_Phi-4-Trim-Exp1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-4-Trim-Exp1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-4-Trim-Exp1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-4-Trim-Exp1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-4-Trim-Exp1
77af94cfcce9fd3c187425a9eaf8f7b36573534f
4.501548
0
7.503
false
false
false
true
0.512342
0.121925
12.192538
0.285166
1.40662
0.005287
0.528701
0.255034
0.671141
0.417688
10.577604
0.114694
1.632683
false
false
2025-02-14
2025-02-14
1
bunnycore/Phi-4-Trim-Exp1 (Merge)
bunnycore_Phi-Seek-4-Sce-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Phi-Seek-4-Sce-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Phi-Seek-4-Sce-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Phi-Seek-4-Sce-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Phi-Seek-4-Sce-V1
8eb60bf63862eac59efd75eedd9edfa5142eb9a3
26.222147
0
14.66
false
false
false
true
1.946403
0.293485
29.348462
0.645911
49.252673
0.214502
21.450151
0.276007
3.467562
0.398156
8.002865
0.512301
45.81117
false
false
2025-02-06
0
Removed
bunnycore_Qandora-2.5-7B-Creative_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qandora-2.5-7B-Creative" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qandora-2.5-7B-Creative</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qandora-2.5-7B-Creative-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qandora-2.5-7B-Creative
fdb174364d4a4f323ed1cb01219ac4d87708219d
32.101826
1
7.616
false
false
false
true
1.41959
0.680315
68.03149
0.554176
36.424652
0.305891
30.589124
0.310403
8.053691
0.421188
10.848438
0.447972
38.663564
false
false
2024-11-20
2024-11-20
1
bunnycore/Qandora-2.5-7B-Creative (Merge)
bunnycore_QandoraExp-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/QandoraExp-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/QandoraExp-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QandoraExp-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/QandoraExp-7B
74906d5518c7feb7df9b168763dabc1b0167942f
36.265
2
7.616
false
false
false
true
1.341055
0.750906
75.090648
0.547796
35.924742
0.47432
47.432024
0.310403
8.053691
0.431208
13.201042
0.440991
37.887855
false
false
2024-11-11
2024-11-11
1
bunnycore/QandoraExp-7B (Merge)
bunnycore_QandoraExp-7B-Persona_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/QandoraExp-7B-Persona" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/QandoraExp-7B-Persona</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QandoraExp-7B-Persona-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/QandoraExp-7B-Persona
21bd6c2e270358b70f9af98bcccd6ec9c2cfce88
31.693541
2
7.616
false
false
false
true
1.375942
0.624686
62.468583
0.555834
36.832709
0.310423
31.042296
0.314597
8.612975
0.437156
13.344531
0.440741
37.860151
false
false
2024-11-12
2024-11-12
1
bunnycore/QandoraExp-7B-Persona (Merge)
bunnycore_QandoraExp-7B-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/QandoraExp-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/QandoraExp-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QandoraExp-7B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/QandoraExp-7B-v2
017594240f9b3c4262e23de6d550453a1a3d5540
31.129639
1
7.616
false
false
false
true
1.387299
0.560689
56.068897
0.544486
34.944967
0.471299
47.129909
0.302852
7.04698
0.404542
9.267708
0.390874
32.319371
false
false
2024-11-12
2024-11-12
1
bunnycore/QandoraExp-7B-v2 (Merge)
bunnycore_QwQen-3B-LCoT_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/QwQen-3B-LCoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/QwQen-3B-LCoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QwQen-3B-LCoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/QwQen-3B-LCoT
a778a78bdd4a2ab58bcbc99269f0673f610e5874
27.986056
2
3.397
false
false
false
true
1.455459
0.602529
60.252907
0.489931
28.499814
0.361782
36.178248
0.266779
2.237136
0.417781
10.75599
0.36993
29.992243
false
false
2024-12-25
2024-12-26
1
bunnycore/QwQen-3B-LCoT (Merge)
bunnycore_QwQen-3B-LCoT-R1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/QwQen-3B-LCoT-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/QwQen-3B-LCoT-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QwQen-3B-LCoT-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/QwQen-3B-LCoT-R1
f6cc2be1224899c81c0931c58e589780442321e5
25.965029
1
3.085
false
false
false
true
0.810269
0.53416
53.416047
0.47986
26.982869
0.335347
33.534743
0.261745
1.565996
0.413844
10.030469
0.37234
30.260047
false
false
2025-02-23
2025-02-23
1
bunnycore/QwQen-3B-LCoT-R1 (Merge)
bunnycore_Qwen-2.5-7B-Deep-Sky-T1_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7B-Deep-Sky-T1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7B-Deep-Sky-T1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7B-Deep-Sky-T1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7B-Deep-Sky-T1
34278bff581951eb410930a05d097b60a997ef3c
15.009116
0
7.613
false
false
false
true
0.710794
0.420805
42.080458
0.413988
17.866965
0.055136
5.513595
0.28104
4.138702
0.401812
8.193229
0.210356
12.261746
false
false
2025-02-16
2025-02-16
1
bunnycore/Qwen-2.5-7B-Deep-Sky-T1 (Merge)
bunnycore_Qwen-2.5-7B-Deep-Stock-v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7B-Deep-Stock-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7B-Deep-Stock-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7B-Deep-Stock-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7B-Deep-Stock-v1
d9edb0b221196bf95b06097ee732807d7dc92da0
27.530591
2
7.613
false
false
false
true
1.406863
0.569507
56.950669
0.536134
34.079862
0.26435
26.435045
0.277685
3.691275
0.410896
9.961979
0.406582
34.064716
false
false
2025-01-25
2025-01-25
1
bunnycore/Qwen-2.5-7B-Deep-Stock-v1 (Merge)
bunnycore_Qwen-2.5-7B-Deep-Stock-v4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7B-Deep-Stock-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7B-Deep-Stock-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7B-Deep-Stock-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7B-Deep-Stock-v4
41abad750ddb89571482318273a814bab91b8ff1
36.10175
2
7.613
false
false
false
true
1.329336
0.775286
77.528624
0.545277
35.910014
0.489426
48.942598
0.300336
6.711409
0.412698
10.38724
0.434176
37.130615
false
false
2025-01-26
2025-01-26
1
bunnycore/Qwen-2.5-7B-Deep-Stock-v4 (Merge)
bunnycore_Qwen-2.5-7B-Deep-Stock-v5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7B-Deep-Stock-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7B-Deep-Stock-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7B-Deep-Stock-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7B-Deep-Stock-v5
4c8027d72ebb17ffb74c2538fa9bf4ae2cec9172
18.507944
3
7.613
false
false
false
true
0.689497
0.450905
45.090471
0.467246
24.990385
0.147281
14.728097
0.270134
2.684564
0.364823
3.202865
0.283162
20.351285
false
false
2025-02-16
2025-02-16
1
bunnycore/Qwen-2.5-7B-Deep-Stock-v5 (Merge)
bunnycore_Qwen-2.5-7B-Exp-Sce_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7B-Exp-Sce" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7B-Exp-Sce</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7B-Exp-Sce-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7B-Exp-Sce
cea556e9f3768a317d746bab5fc830216af373bb
33.86355
2
7.613
false
false
false
true
0.71669
0.76517
76.516975
0.550587
36.239001
0.325529
32.55287
0.298658
6.487696
0.443021
15.177604
0.425864
36.207151
false
false
2025-02-08
2025-02-08
1
bunnycore/Qwen-2.5-7B-Exp-Sce (Merge)
bunnycore_Qwen-2.5-7B-R1-Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7B-R1-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7B-R1-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7B-R1-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7B-R1-Stock
ce916b3394fb8d741e80f00f710962c10c6623d3
35.31938
2
7.613
false
false
false
true
1.354424
0.757326
75.732612
0.539336
34.850441
0.500755
50.075529
0.299497
6.599553
0.399365
8.053906
0.429438
36.604241
false
false
2025-01-24
2025-01-24
1
bunnycore/Qwen-2.5-7B-R1-Stock (Merge)
bunnycore_Qwen-2.5-7B-Stock-Deep-Bespoke_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7B-Stock-Deep-Bespoke" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7B-Stock-Deep-Bespoke</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7B-Stock-Deep-Bespoke-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7B-Stock-Deep-Bespoke
662c55748486b23fd676f45add9453294e79749a
23.473415
0
7.613
false
false
false
true
1.42669
0.520622
52.062195
0.492035
28.178037
0.188822
18.882175
0.28104
4.138702
0.406802
8.916927
0.357962
28.662456
false
false
2025-01-25
2025-01-25
1
bunnycore/Qwen-2.5-7B-Stock-Deep-Bespoke (Merge)
bunnycore_Qwen-2.5-7b-S1k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen-2.5-7b-S1k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen-2.5-7b-S1k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen-2.5-7b-S1k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen-2.5-7b-S1k
32b66e4488724dca26655057f62bb3fae9ad11ef
34.592754
2
7.613
false
false
false
true
0.694232
0.716235
71.623514
0.556275
36.694203
0.478097
47.809668
0.284396
4.58613
0.407146
9.259896
0.438248
37.583112
false
false
2025-02-19
2025-02-20
1
bunnycore/Qwen-2.5-7b-S1k (Merge)
bunnycore_Qwen2.5-1.5B-Model-Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-1.5B-Model-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-1.5B-Model-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-1.5B-Model-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-1.5B-Model-Stock
6496cbb16e56f68248d38a792a7e51b7505f6cc3
4.199523
0
1.776
false
false
false
true
0.606681
0.182926
18.292575
0.28737
1.430207
0
0
0.259228
1.230425
0.367427
3.128385
0.11004
1.115544
false
false
2025-02-28
0
Removed
bunnycore_Qwen2.5-3B-Model-Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-Model-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-Model-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-Model-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-Model-Stock
ee5704f7bbfbcd15c64dc2628a82362d2c8ef016
27.52421
1
3.396
false
false
false
true
1.515812
0.638075
63.807475
0.471248
26.002264
0.379909
37.990937
0.288591
5.145414
0.394156
7.202865
0.324967
24.996306
false
false
2025-01-15
2025-01-15
1
bunnycore/Qwen2.5-3B-Model-Stock (Merge)
bunnycore_Qwen2.5-3B-Model-Stock-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-Model-Stock-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-Model-Stock-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-Model-Stock-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-Model-Stock-v2
5911ddc9c6cc97299cd0bbef75664bdf0493f42d
27.687679
2
3.396
false
false
false
true
1.51426
0.649016
64.901572
0.467748
25.648547
0.386707
38.670695
0.286913
4.9217
0.391458
6.765625
0.326961
25.217937
false
false
2025-01-15
2025-01-15
1
bunnycore/Qwen2.5-3B-Model-Stock-v2 (Merge)
bunnycore_Qwen2.5-3B-Model-Stock-v3.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-Model-Stock-v3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-Model-Stock-v3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-Model-Stock-v3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-Model-Stock-v3.1
4b56e847600c4be09df262470b1f3602fde13386
27.972764
3
3.396
false
false
false
true
0.766885
0.648092
64.809151
0.473722
26.396632
0.389728
38.97281
0.284396
4.58613
0.396792
7.632292
0.328956
25.439569
false
false
2025-02-25
2025-02-25
1
bunnycore/Qwen2.5-3B-Model-Stock-v3.1 (Merge)
bunnycore_Qwen2.5-3B-Model-Stock-v3.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-Model-Stock-v3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-Model-Stock-v3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-Model-Stock-v3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-Model-Stock-v3.2
3ac3318182fb329456d69e298a525d836e55c433
27.386759
2
3.396
false
false
false
true
0.763662
0.635302
63.530211
0.472742
26.326938
0.375378
37.537764
0.283557
4.474273
0.392792
6.965625
0.329372
25.485742
false
false
2025-02-25
2025-02-26
1
bunnycore/Qwen2.5-3B-Model-Stock-v3.2 (Merge)
bunnycore_Qwen2.5-3B-Model-Stock-v4.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-Model-Stock-v4.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-Model-Stock-v4.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-Model-Stock-v4.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-Model-Stock-v4.1
ac75d8b3f0f5bfe0da95889e6b4671b91b3f58a1
27.743274
2
3.396
false
false
false
true
0.77504
0.638075
63.807475
0.482026
27.399951
0.376888
37.688822
0.279362
3.914989
0.394094
7.128385
0.33868
26.520021
false
false
2025-02-28
2025-02-28
1
bunnycore/Qwen2.5-3B-Model-Stock-v4.1 (Merge)
bunnycore_Qwen2.5-3B-RP-Mix_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-RP-Mix" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-RP-Mix</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-RP-Mix-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-RP-Mix
0e8f3b56f9270fdcdd4badfd7b925dc8fc4902c7
25.505207
4
3.397
false
false
false
true
1.839402
0.572054
57.205437
0.489438
28.305923
0.215257
21.52568
0.27349
3.131991
0.428448
12.55599
0.372756
30.30622
false
false
2024-10-22
2024-10-22
1
bunnycore/Qwen2.5-3B-RP-Mix (Merge)
bunnycore_Qwen2.5-3B-RP-Thinker_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-RP-Thinker" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-RP-Thinker</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-RP-Thinker-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-RP-Thinker
ffe872591c1fe1c06464779dd2abfddf9ba7b9f8
22.906122
1
3.397
false
false
false
true
1.440127
0.589415
58.941497
0.416413
17.412964
0.335347
33.534743
0.264262
1.901566
0.328729
1.757812
0.314993
23.88815
false
false
2024-12-31
2024-12-31
1
bunnycore/Qwen2.5-3B-RP-Thinker (Merge)
bunnycore_Qwen2.5-3B-RP-Thinker-V2_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-3B-RP-Thinker-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-3B-RP-Thinker-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-3B-RP-Thinker-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-3B-RP-Thinker-V2
61889a5b40995a556e9c0cb1c64223559a6035a5
27.670372
2
3.397
false
false
false
true
1.440474
0.641997
64.199657
0.467844
25.629507
0.382931
38.293051
0.285235
4.697987
0.398125
7.965625
0.327128
25.236407
false
false
2024-12-31
2024-12-31
1
bunnycore/Qwen2.5-3B-RP-Thinker-V2 (Merge)
bunnycore_Qwen2.5-7B-CyberRombos_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-CyberRombos" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-CyberRombos</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-CyberRombos-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-CyberRombos
dfd4d30fc6956ffecb9fb3c59fad51875552f7f9
35.950552
2
7.616
false
false
false
true
1.420211
0.751831
75.18307
0.546496
35.884025
0.496224
49.622356
0.30453
7.270694
0.412542
10.067708
0.439079
37.675458
false
false
2024-11-04
2024-11-05
1
bunnycore/Qwen2.5-7B-CyberRombos (Merge)
bunnycore_Qwen2.5-7B-Fuse-Exp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-Fuse-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-Fuse-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-Fuse-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-Fuse-Exp
8602ca392ec0414cca119fea98a06c20049488b2
26.925298
2
7.616
false
false
false
true
0.675442
0.54685
54.685014
0.510868
29.967151
0.314199
31.41994
0.276007
3.467562
0.457281
16.360156
0.330868
25.651965
false
false
2025-03-12
2025-03-13
1
bunnycore/Qwen2.5-7B-Fuse-Exp (Merge)
bunnycore_Qwen2.5-7B-Instruct-Fusion_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-Instruct-Fusion" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-Instruct-Fusion</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-Instruct-Fusion-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-Instruct-Fusion
6313c0b3de799ab48720c3b828e322a77cf8d023
33.10123
5
7.616
false
false
false
true
1.328965
0.696202
69.620163
0.54919
36.179859
0.340634
34.063444
0.30453
7.270694
0.429719
12.948177
0.446725
38.525044
false
false
2024-10-31
2024-11-02
1
bunnycore/Qwen2.5-7B-Instruct-Fusion (Merge)
bunnycore_Qwen2.5-7B-Instruct-Merge-Stock-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-Instruct-Merge-Stock-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-Instruct-Merge-Stock-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-Instruct-Merge-Stock-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-Instruct-Merge-Stock-v0.1
f205f75c1cd0436535cf2bca702844877e70781b
36.144831
2
7.613
false
false
false
true
0.680533
0.750906
75.090648
0.552943
36.395231
0.489426
48.942598
0.303691
7.158837
0.423115
11.689323
0.438331
37.592346
false
false
2025-03-01
2025-03-01
1
bunnycore/Qwen2.5-7B-Instruct-Merge-Stock-v0.1 (Merge)
bunnycore_Qwen2.5-7B-MixStock-Sce-V0.3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-MixStock-Sce-V0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-MixStock-Sce-V0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-MixStock-Sce-V0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-MixStock-Sce-V0.3
94344107260688d9e381cc519f219990fd3dfd6b
11.636394
0
7.613
false
false
false
true
0.726466
0.211976
21.197644
0.347901
9.507336
0.257553
25.755287
0.25755
1.006711
0.371396
3.691146
0.177942
8.660239
false
false
2025-02-13
2025-02-14
1
bunnycore/Qwen2.5-7B-MixStock-Sce-V0.3 (Merge)
bunnycore_Qwen2.5-7B-MixStock-V0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-MixStock-V0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-MixStock-V0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-MixStock-V0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-MixStock-V0.1
8fb8a90c094c74fce75125c4783fcc817a30243f
33.686597
3
7.613
false
false
false
true
2.128945
0.767343
76.734287
0.54791
35.869258
0.317221
31.722054
0.300336
6.711409
0.441625
14.903125
0.425615
36.179447
false
false
2025-02-05
2025-02-05
1
bunnycore/Qwen2.5-7B-MixStock-V0.1 (Merge)
bunnycore_Qwen2.5-7B-R1-Bespoke-Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-R1-Bespoke-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-R1-Bespoke-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-R1-Bespoke-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-R1-Bespoke-Stock
f41a11dd958c01949397118e488fa44288c95483
20.435991
0
7.613
false
false
false
true
1.433776
0.372645
37.264458
0.482214
26.638698
0.204683
20.468278
0.278523
3.803132
0.392635
6.979427
0.347158
27.461953
false
false
2025-01-24
2025-01-25
1
bunnycore/Qwen2.5-7B-R1-Bespoke-Stock (Merge)
bunnycore_Qwen2.5-7B-R1-Bespoke-Task_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-R1-Bespoke-Task" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-R1-Bespoke-Task</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-R1-Bespoke-Task-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-R1-Bespoke-Task
ff2b303f7fd2acaa20d81b880d626dc8e785f5a0
15.746099
0
7.613
false
false
false
true
1.405181
0.378664
37.866417
0.414955
17.906938
0.178248
17.824773
0.253356
0.447427
0.356885
1.677344
0.268783
18.753694
false
false
2025-01-24
2025-01-25
1
bunnycore/Qwen2.5-7B-R1-Bespoke-Task (Merge)
bunnycore_Qwen2.5-7B-RRP-1M_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-RRP-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-RRP-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-RRP-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-RRP-1M
c49c82be793bd630230f81d627724af62abdbd1a
33.679286
6
7.613
false
false
false
true
1.361353
0.748134
74.813384
0.545239
35.648526
0.324773
32.477341
0.302852
7.04698
0.44826
15.799219
0.426612
36.290263
false
false
2025-01-26
2025-01-26
1
bunnycore/Qwen2.5-7B-RRP-1M (Merge)
bunnycore_Qwen2.5-7B-RRP-1M-Thinker_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-RRP-1M-Thinker" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-RRP-1M-Thinker</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-RRP-1M-Thinker-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-RRP-1M-Thinker
ddc20ed7a1d4ce5f1d47dd30181540c2ddc79d8c
12.275363
1
7.613
false
false
false
true
0.710122
0.230811
23.081092
0.348191
9.209373
0.271903
27.190332
0.25755
1.006711
0.376729
4.624479
0.176862
8.540189
false
false
2025-02-14
2025-02-18
1
bunnycore/Qwen2.5-7B-RRP-1M-Thinker (Merge)
bunnycore_Qwen2.5-7B-RRP-ID_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-RRP-ID" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-RRP-ID</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-RRP-ID-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-RRP-ID
9e5c600d49ae0697dc4584477fdcc941c66f5ef4
35.46322
0
7.616
false
false
false
true
1.50306
0.747259
74.725949
0.547954
36.099189
0.486405
48.640483
0.282718
4.362416
0.417969
11.31276
0.438747
37.63852
false
false
2025-01-27
0
Removed
bunnycore_Qwen2.5-7B-Sky-R1-Mini_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Qwen2.5-7B-Sky-R1-Mini" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Qwen2.5-7B-Sky-R1-Mini</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Qwen2.5-7B-Sky-R1-Mini-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Qwen2.5-7B-Sky-R1-Mini
15902f67ef18fd6d8e325c48c726007bebf98fce
7.371821
0
7.616
false
false
false
false
0.665592
0.230486
23.048622
0.350294
8.895167
0.029456
2.945619
0.28943
5.257271
0.344823
1.269531
0.125332
2.814716
false
false
2025-02-18
2025-02-18
1
bunnycore/Qwen2.5-7B-Sky-R1-Mini (Merge)
bunnycore_QwenMosaic-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/QwenMosaic-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/QwenMosaic-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__QwenMosaic-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/QwenMosaic-7B
1eab0bbe701195ba26f60a284f74e3c6dfe5c139
31.300372
1
7.616
false
false
false
true
1.500574
0.581922
58.192152
0.556413
36.75052
0.444109
44.410876
0.260906
1.454139
0.416385
10.214844
0.431017
36.779699
false
false
2024-12-01
2024-12-02
1
bunnycore/QwenMosaic-7B (Merge)
bunnycore_Smol-Llama-3.2-3B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Smol-Llama-3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Smol-Llama-3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Smol-Llama-3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Smol-Llama-3.2-3B
d66d88bdfb94a879ac3a0ba4891aefb26f4d384f
22.522193
0
3.607
false
false
false
true
1.13141
0.66785
66.785019
0.453881
23.040764
0.138218
13.821752
0.276846
3.579418
0.346
3.15
0.322806
24.756206
false
false
2024-12-29
2024-12-29
1
bunnycore/Smol-Llama-3.2-3B (Merge)
bunnycore_SmolLM2-1.7-Persona_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/SmolLM2-1.7-Persona" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/SmolLM2-1.7-Persona</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__SmolLM2-1.7-Persona-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/SmolLM2-1.7-Persona
ebeaa6f284c044bd54e3e66cc5458d974d92523e
14.527349
apache-2.0
0
1.711
true
false
false
true
0.662665
0.546525
54.652544
0.362321
11.203753
0.056647
5.664653
0.263423
1.789709
0.334125
3.032292
0.19739
10.821144
true
false
2024-11-15
2024-11-15
0
bunnycore/SmolLM2-1.7-Persona
bunnycore_SmolLM2-1.7B-roleplay-lora_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/bunnycore/SmolLM2-1.7B-roleplay-lora" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/SmolLM2-1.7B-roleplay-lora</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__SmolLM2-1.7B-roleplay-lora-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/SmolLM2-1.7B-roleplay-lora
bbab860a4ffdd8e48f600192947ad3504bb0a944
14.47906
apache-2.0
0
3.423
true
false
false
true
1.400051
0.538208
53.820751
0.361034
10.907238
0.05287
5.287009
0.275168
3.355705
0.339458
2.765625
0.196642
10.738032
false
false
2024-11-15
2024-11-15
3
HuggingFaceTB/SmolLM2-1.7B-Instruct (Merge)
bunnycore_Tulu-3.1-8B-SuperNova_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/bunnycore/Tulu-3.1-8B-SuperNova" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bunnycore/Tulu-3.1-8B-SuperNova</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bunnycore__Tulu-3.1-8B-SuperNova-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
bunnycore/Tulu-3.1-8B-SuperNova
bbbfb910ca8d8f7ae35ecaf4824ad68713bf8d86
30.991376
4
8.03
false
false
false
true
1.387848
0.819375
81.937481
0.525412
32.499171
0.246224
24.622356
0.302013
6.935123
0.3935
8.6875
0.3814
31.266622
false
false
2024-11-22
2024-11-23
1
bunnycore/Tulu-3.1-8B-SuperNova (Merge)
byroneverson_Mistral-Small-Instruct-2409-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/byroneverson/Mistral-Small-Instruct-2409-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">byroneverson/Mistral-Small-Instruct-2409-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/byroneverson__Mistral-Small-Instruct-2409-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
byroneverson/Mistral-Small-Instruct-2409-abliterated
5e24aaef2a37f9cb69f70ae9fe714f9d9599fd6e
28.805846
other
13
22.247
true
false
false
true
2.804573
0.697076
69.707598
0.523786
31.2557
0.247734
24.773414
0.333054
11.073826
0.369719
3.548177
0.392287
32.476359
false
false
2024-09-23
2024-10-13
1
mistralai/Mistral-Small-Instruct-2409
byroneverson_Yi-1.5-9B-Chat-16K-abliterated_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/byroneverson/Yi-1.5-9B-Chat-16K-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">byroneverson/Yi-1.5-9B-Chat-16K-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/byroneverson__Yi-1.5-9B-Chat-16K-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
byroneverson/Yi-1.5-9B-Chat-16K-abliterated
84a6eaa723633bbefc7cfac9c64bf0e0a4d39065
26.948135
apache-2.0
4
8.829
true
false
false
true
2.180206
0.552845
55.284534
0.528205
32.843259
0.141239
14.123867
0.312919
8.389262
0.473438
19.679687
0.382314
31.368203
false
false
2024-09-03
2024-09-03
1
01-ai/Yi-1.5-9B-Chat-16K
byroneverson_Yi-1.5-9B-Chat-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/byroneverson/Yi-1.5-9B-Chat-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">byroneverson/Yi-1.5-9B-Chat-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/byroneverson__Yi-1.5-9B-Chat-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
byroneverson/Yi-1.5-9B-Chat-abliterated
4e26c200cdf2dc50dd50cdd9fe5b74887e9fa94a
26.270006
apache-2.0
2
8.829
true
false
false
true
1.689145
0.572329
57.23292
0.540122
34.352187
0.166163
16.616314
0.291946
5.592841
0.438865
13.658073
0.371509
30.167701
false
false
2024-09-04
2024-09-17
1
01-ai/Yi-1.5-9B-Chat
c10x_Q-Pluse_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/c10x/Q-Pluse" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">c10x/Q-Pluse</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/c10x__Q-Pluse-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
c10x/Q-Pluse
3.634371
0
7.616
false
false
false
true
2.623382
0.112283
11.228319
0.287511
1.947945
0
0
0.246644
0
0.393812
7.126563
0.113531
1.503398
false
false
2024-10-10
0
Removed
c10x_longthinker_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/c10x/longthinker" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">c10x/longthinker</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/c10x__longthinker-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
c10x/longthinker
e1bb4a2c2782ab52be7a8fa2e5905f08b7cfd464
20.730888
0
8.03
false
false
false
true
1.889549
0.360879
36.087913
0.492749
28.424737
0.231873
23.187311
0.264262
1.901566
0.390958
6.703125
0.352726
28.080674
false
false
2024-10-10
0
Removed
carsenk_flippa-v6_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/carsenk/flippa-v6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">carsenk/flippa-v6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/carsenk__flippa-v6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
carsenk/flippa-v6
5206a32e0bd3067aef1ce90f5528ade7d866253f
20.776367
llama3.1
1
16.061
true
false
false
false
2.129601
0.343943
34.394296
0.504697
29.993501
0.140483
14.048338
0.292785
5.704698
0.408875
10.876042
0.366772
29.641327
false
false
2024-08-24
2024-08-24
2
meta-llama/Meta-Llama-3.1-8B
carsenk_phi3.5_mini_exp_825_uncensored_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/carsenk/phi3.5_mini_exp_825_uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">carsenk/phi3.5_mini_exp_825_uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/carsenk__phi3.5_mini_exp_825_uncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
carsenk/phi3.5_mini_exp_825_uncensored
6b208dc3df02e0d5ef0c3fe5899f9f31618f2e94
3.643109
apache-2.0
2
3.821
true
false
false
true
0.975637
0.136414
13.64136
0.296473
1.827813
0.010574
1.057402
0.249161
0
0.364417
3.385417
0.11752
1.946661
false
false
2024-08-29
2024-08-29
2
microsoft/Phi-3.5-mini-instruct
cat-searcher_gemma-2-9b-it-sppo-iter-1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/cat-searcher/gemma-2-9b-it-sppo-iter-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cat-searcher/gemma-2-9b-it-sppo-iter-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cat-searcher__gemma-2-9b-it-sppo-iter-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cat-searcher/gemma-2-9b-it-sppo-iter-1
b29a3a5cef93ee044e2297fcb40bd2976415e900
21.938641
0
9.242
false
false
false
true
5.536079
0.301477
30.147675
0.597187
41.676308
0.083082
8.308157
0.344799
12.639821
0.392667
7.15
0.385389
31.709885
false
false
2024-08-09
0
Removed
cat-searcher_gemma-2-9b-it-sppo-iter-1-evol-1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/cat-searcher/gemma-2-9b-it-sppo-iter-1-evol-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cat-searcher/gemma-2-9b-it-sppo-iter-1-evol-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cat-searcher__gemma-2-9b-it-sppo-iter-1-evol-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cat-searcher/gemma-2-9b-it-sppo-iter-1-evol-1
c2d7b76786151aecfa5972a2a3e937feb2d2c48b
21.525463
0
9.242
false
false
false
true
5.575625
0.294183
29.418277
0.593937
41.10464
0.085347
8.534743
0.340604
12.080537
0.392573
6.904948
0.379987
31.109634
false
false
2024-08-09
0
Removed
cckm_tinymistral_950m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cckm/tinymistral_950m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cckm/tinymistral_950m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cckm__tinymistral_950m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cckm/tinymistral_950m
b8ac79e9904405e6cc793101c098561a47b2d0d7
5.219823
mit
2
0.955
true
false
false
false
0.736567
0.239529
23.952889
0.296946
2.371788
0.005287
0.528701
0.260067
1.342282
0.355365
2.053906
0.109624
1.069371
false
false
2025-01-12
2025-01-13
1
cckm/tinymistral_950m (Merge)
cgato_TheSalt-L3-8b-v0.3.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cgato/TheSalt-L3-8b-v0.3.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cgato/TheSalt-L3-8b-v0.3.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cgato__TheSalt-L3-8b-v0.3.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cgato/TheSalt-L3-8b-v0.3.2
5cf08e2bf9590ebcd14ba021e113def28c65afa2
7.399889
cc-by-nc-4.0
1
8.03
true
false
false
true
1.880588
0.270503
27.050338
0.296797
2.612714
0.047583
4.758308
0.26594
2.12528
0.389625
6.303125
0.113946
1.549572
false
false
2024-06-18
2024-06-26
0
cgato/TheSalt-L3-8b-v0.3.2
chargoddard_prometheus-2-llama-3-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/chargoddard/prometheus-2-llama-3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chargoddard/prometheus-2-llama-3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/chargoddard__prometheus-2-llama-3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chargoddard/prometheus-2-llama-3-8b
90a728ac98e5b4169f88ae4945e357cf45477568
19.318862
apache-2.0
2
8.03
true
false
false
true
1.890229
0.52889
52.889001
0.493114
27.803839
0.082326
8.232628
0.272651
3.020134
0.339583
0.78125
0.308677
23.186318
true
false
2024-05-26
2024-06-26
1
chargoddard/prometheus-2-llama-3-8b (Merge)
chujiezheng_Llama-3-Instruct-8B-SimPO-ExPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/chujiezheng__Llama-3-Instruct-8B-SimPO-ExPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO
3fcaa9fe99691659eb197487e9a343f601bf63f2
23.054922
llama3
16
8.03
true
false
false
true
1.440963
0.643371
64.33707
0.476452
25.868282
0.070242
7.024169
0.286913
4.9217
0.39201
9.501302
0.340093
26.677009
false
false
2024-05-26
2024-06-26
0
chujiezheng/Llama-3-Instruct-8B-SimPO-ExPO
chujiezheng_Mistral7B-PairRM-SPPO-ExPO_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/chujiezheng/Mistral7B-PairRM-SPPO-ExPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chujiezheng/Mistral7B-PairRM-SPPO-ExPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/chujiezheng__Mistral7B-PairRM-SPPO-ExPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chujiezheng/Mistral7B-PairRM-SPPO-ExPO
d3e8342a63e5ae096f450f2467a92168db12768c
13.617149
apache-2.0
0
7.242
true
false
false
true
1.018068
0.367349
36.734863
0.388219
13.678636
0.018127
1.812689
0.276846
3.579418
0.405531
8.658073
0.255153
17.239214
false
false
2024-05-04
2024-09-21
0
chujiezheng/Mistral7B-PairRM-SPPO-ExPO
cjvt_GaMS-1B_float16
float16
🟩 continuously pretrained
🟩
Original
OPTForCausalLM
<a target="_blank" href="https://huggingface.co/cjvt/GaMS-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cjvt/GaMS-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cjvt__GaMS-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cjvt/GaMS-1B
1620a336e3317ba3fa56586995e46ea9fbadd407
4.62176
apache-2.0
1
1.54
true
false
false
false
0.507882
0.163542
16.354163
0.307475
3.861742
0.013595
1.359517
0.258389
1.118568
0.368417
3.385417
0.11486
1.651152
false
false
2024-09-18
2025-02-13
0
cjvt/GaMS-1B
cloudyu_Llama-3-70Bx2-MOE_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/cloudyu/Llama-3-70Bx2-MOE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/Llama-3-70Bx2-MOE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Llama-3-70Bx2-MOE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cloudyu/Llama-3-70Bx2-MOE
b8bd85e8db8e4ec352b93441c92e0ae1334bf5a7
35.666465
llama3
1
126.926
true
true
false
false
43.07911
0.548249
54.824865
0.663623
51.422138
0.217523
21.752266
0.393456
19.127517
0.481188
20.848437
0.514212
46.023567
false
false
2024-05-20
2024-06-27
0
cloudyu/Llama-3-70Bx2-MOE
cloudyu_Llama-3.2-3Bx4_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/cloudyu/Llama-3.2-3Bx4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/Llama-3.2-3Bx4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Llama-3.2-3Bx4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cloudyu/Llama-3.2-3Bx4
d0d893eb5937ba4c3dd4f58471d5ac64334e6ff6
18.997311
0
9.949
false
false
false
false
2.534152
0.506858
50.685847
0.433219
19.793328
0.107251
10.725076
0.277685
3.691275
0.349563
7.028646
0.298537
22.059693
false
false
2025-01-27
0
Removed
cloudyu_Mixtral_11Bx2_MoE_19B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/cloudyu/Mixtral_11Bx2_MoE_19B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/Mixtral_11Bx2_MoE_19B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Mixtral_11Bx2_MoE_19B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cloudyu/Mixtral_11Bx2_MoE_19B
39edb16515e431617f7ce69f9b4166b40f97f34b
20.407079
cc-by-nc-4.0
37
19.188
true
true
false
false
1.116447
0.385084
38.50838
0.520852
32.78564
0.067221
6.722054
0.290268
5.369128
0.429688
13.377604
0.331117
25.679669
false
false
2023-12-31
2025-02-16
0
cloudyu/Mixtral_11Bx2_MoE_19B
cloudyu_Mixtral_34Bx2_MoE_60B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/cloudyu/Mixtral_34Bx2_MoE_60B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/Mixtral_34Bx2_MoE_60B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Mixtral_34Bx2_MoE_60B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cloudyu/Mixtral_34Bx2_MoE_60B
d01642769ccc782e1db1fc26cb25097aecb98e23
27.611169
apache-2.0
112
60.814
true
true
false
false
14.665177
0.453777
45.377709
0.58697
41.209129
0.077039
7.703927
0.338087
11.744966
0.462521
17.781771
0.476646
41.849512
false
false
2024-01-05
2024-08-22
0
cloudyu/Mixtral_34Bx2_MoE_60B
cloudyu_Mixtral_7Bx2_MoE_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/cloudyu/Mixtral_7Bx2_MoE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/Mixtral_7Bx2_MoE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Mixtral_7Bx2_MoE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cloudyu/Mixtral_7Bx2_MoE
5b7b6efb5110eccbcc752f92413eea22bacdd1c2
21.447316
cc-by-nc-4.0
36
12.879
true
true
false
false
1.56763
0.448007
44.800684
0.515973
32.276641
0.068731
6.873112
0.305369
7.38255
0.447292
14.644792
0.304355
22.706117
false
false
2023-12-22
2024-12-31
0
cloudyu/Mixtral_7Bx2_MoE
cloudyu_S1-Llama-3.2-3Bx4-MoE_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/cloudyu/S1-Llama-3.2-3Bx4-MoE" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/S1-Llama-3.2-3Bx4-MoE</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__S1-Llama-3.2-3Bx4-MoE-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cloudyu/S1-Llama-3.2-3Bx4-MoE
a6af7ede4c291fc91dd54ff73fe64df840288367
19.960752
llama3
0
9.555
true
true
false
false
2.711206
0.530214
53.021428
0.435789
20.041555
0.120091
12.009063
0.293624
5.816555
0.345625
6.169792
0.304355
22.706117
false
false
2025-02-05
2025-02-05
0
cloudyu/S1-Llama-3.2-3Bx4-MoE
cloudyu_Yi-34Bx2-MoE-60B-DPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/cloudyu/Yi-34Bx2-MoE-60B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cloudyu/Yi-34Bx2-MoE-60B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cloudyu__Yi-34Bx2-MoE-60B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cloudyu/Yi-34Bx2-MoE-60B-DPO
5c2d31042229ee06246064100b781dd926cb0ffd
26.043502
apache-2.0
3
60.814
true
true
false
true
14.678493
0.531888
53.188761
0.516831
31.259298
0.070242
7.024169
0.322148
9.619687
0.437469
14.316927
0.46767
40.852172
false
false
2024-01-23
2024-08-06
0
cloudyu/Yi-34Bx2-MoE-60B-DPO
cluebbers_Llama-3.1-8B-paraphrase-type-generation-apty-ipo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cluebbers__Llama-3.1-8B-paraphrase-type-generation-apty-ipo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo
eb04613997875935cb667a517e518874bb716169
10.051762
apache-2.0
0
8.03
true
false
false
false
1.439662
0.132667
13.266688
0.380022
12.669478
0.024924
2.492447
0.263423
1.789709
0.433219
12.41901
0.259059
17.673242
false
false
2024-11-14
2024-11-18
1
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-ipo (Merge)
cluebbers_Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cluebbers__Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid
2c8b52e8db11a6ff57cccf890ee26688e858f9fb
10.070699
apache-2.0
0
8.03
true
false
false
false
1.44712
0.131842
13.18424
0.37889
12.757325
0.026435
2.643505
0.268456
2.46085
0.430552
12.01901
0.256233
17.359264
false
false
2024-11-15
2024-11-18
1
cluebbers/Llama-3.1-8B-paraphrase-type-generation-apty-sigmoid (Merge)
cluebbers_Llama-3.1-8B-paraphrase-type-generation-etpc_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cluebbers__Llama-3.1-8B-paraphrase-type-generation-etpc-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc
a003a227aed5c1ad67cd4a653b13a0dd7acb7ed5
9.681788
apache-2.0
0
8.03
true
false
false
false
1.483054
0.120852
12.085156
0.378081
12.694579
0.019637
1.963746
0.265101
2.013423
0.431854
12.048437
0.255568
17.285387
false
false
2024-11-04
2024-11-18
1
cluebbers/Llama-3.1-8B-paraphrase-type-generation-etpc (Merge)
cognitivecomputations_Dolphin3.0-Llama3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/Dolphin3.0-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/Dolphin3.0-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__Dolphin3.0-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/Dolphin3.0-Llama3.1-8B
0bf45a981ba100596ee0c3e7d27e7849b0206632
25.269844
llama3.1
155
8.03
true
false
false
true
1.238651
0.762122
76.212228
0.491637
27.631703
0.123112
12.311178
0.282718
4.362416
0.365344
8.967969
0.299202
22.13357
false
true
2024-12-29
2025-01-05
1
cognitivecomputations/Dolphin3.0-Llama3.1-8B (Merge)
cognitivecomputations_Dolphin3.0-Llama3.2-1B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/Dolphin3.0-Llama3.2-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/Dolphin3.0-Llama3.2-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__Dolphin3.0-Llama3.2-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/Dolphin3.0-Llama3.2-1B
100ed6fbb590630622f795adc792f38df1c5b2f7
11.140988
llama3.2
23
1.236
true
false
false
true
2.008096
0.542779
54.277872
0.312225
4.657279
0.027946
2.794562
0.229866
0
0.324885
0.94401
0.13755
4.172207
false
true
2024-12-29
2025-03-12
1
cognitivecomputations/Dolphin3.0-Llama3.2-1B (Merge)
cognitivecomputations_Dolphin3.0-Qwen2.5-0.5B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__Dolphin3.0-Qwen2.5-0.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B
7111aeb3bedf7414baaf1532e84d470519c667f2
10.626273
apache-2.0
12
0.494
true
false
false
true
0.824517
0.469714
46.971369
0.311422
5.096928
0.05136
5.135952
0.234899
0
0.355458
1.965625
0.14129
4.587766
false
true
2024-12-29
2025-03-02
1
cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B (Merge)
cognitivecomputations_Dolphin3.0-R1-Mistral-24B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/Dolphin3.0-R1-Mistral-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/Dolphin3.0-R1-Mistral-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__Dolphin3.0-R1-Mistral-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/Dolphin3.0-R1-Mistral-24B
34368009d6122e9ef796826bc0ca3989a47ea33e
23.513142
165
23.572
true
false
false
true
2.953672
0.406816
40.681614
0.53597
33.763678
0.311934
31.193353
0.294463
5.928412
0.395177
7.230469
0.300532
22.281324
false
true
2025-02-06
2025-02-07
1
cognitivecomputations/Dolphin3.0-R1-Mistral-24B (Merge)
cognitivecomputations_dolphin-2.9-llama3-8b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9-llama3-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9-llama3-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9-llama3-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9-llama3-8b
5aeb036f9215c558b483a654a8c6e1cc22e841bf
18.415461
other
444
8.03
true
false
false
true
1.47824
0.385034
38.503393
0.494992
27.858929
0.057402
5.740181
0.286913
4.9217
0.437531
13.791406
0.277094
19.677157
false
true
2024-04-20
2024-06-12
1
meta-llama/Meta-Llama-3-8B
cognitivecomputations_dolphin-2.9.1-llama-3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.1-llama-3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.1-llama-3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.1-llama-3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.1-llama-3-70b
31adf616c3c9176d147e0a62e9fedb7bf97678ac
25.534386
llama3
42
70.554
true
false
false
true
24.298176
0.376017
37.601675
0.520492
31.101152
0.182024
18.202417
0.308725
7.829978
0.497562
23.695312
0.412982
34.775783
false
true
2024-05-22
2024-06-27
1
meta-llama/Meta-Llama-3-70B
cognitivecomputations_dolphin-2.9.1-yi-1.5-34b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.1-yi-1.5-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.1-yi-1.5-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.1-yi-1.5-34b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.1-yi-1.5-34b
1ec522298a6935c881df6dc29d3669833bd8672d
28.307204
apache-2.0
35
34.389
true
false
false
true
5.985307
0.385259
38.525889
0.607623
44.174089
0.186556
18.655589
0.343121
12.416107
0.459792
16.973958
0.451878
39.097592
false
true
2024-05-18
2024-07-27
1
01-ai/Yi-1.5-34B
cognitivecomputations_dolphin-2.9.1-yi-1.5-9b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.1-yi-1.5-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.1-yi-1.5-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.1-yi-1.5-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.1-yi-1.5-9b
91f0a521e3e2a0675a3549aa5d3f40717068de94
25.639724
apache-2.0
26
8.829
true
false
false
true
2.101731
0.446533
44.653298
0.548431
35.77609
0.151813
15.181269
0.338087
11.744966
0.434802
13.516927
0.396692
32.965795
false
true
2024-05-18
2024-08-02
1
01-ai/Yi-1.5-9B
cognitivecomputations_dolphin-2.9.2-Phi-3-Medium_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-Phi-3-Medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-Phi-3-Medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium
0470c5b912b51fa6e27d87a8ea7feafacd8cb101
28.614516
mit
22
-1
true
false
false
true
1.680964
0.424776
42.477626
0.645674
49.72194
0.182779
18.277946
0.327181
10.290828
0.419052
11.414844
0.455535
39.503915
false
true
2024-05-31
2024-08-05
1
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium (Merge)
cognitivecomputations_dolphin-2.9.2-Phi-3-Medium-abliterated_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-Phi-3-Medium-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated
d50be5f22ca9745a2a3175996611d6a840318b7f
25.590064
mit
18
13.96
true
false
false
false
0.843954
0.361254
36.12537
0.612323
45.441267
0.123867
12.386707
0.32802
10.402685
0.411177
10.363802
0.449385
38.820553
false
true
2024-06-03
2024-06-27
1
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated (Merge)
cognitivecomputations_dolphin-2.9.2-Phi-3-Medium-abliterated_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-Phi-3-Medium-abliterated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated
d50be5f22ca9745a2a3175996611d6a840318b7f
28.538872
mit
18
13.96
true
false
false
true
1.641593
0.412361
41.236142
0.638289
48.385347
0.182024
18.202417
0.328859
10.514541
0.434927
13.732552
0.45246
39.162234
false
true
2024-06-03
2024-08-05
1
cognitivecomputations/dolphin-2.9.2-Phi-3-Medium-abliterated (Merge)
cognitivecomputations_dolphin-2.9.2-qwen2-72b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.2-qwen2-72b
e79582577c2bf2af304221af0e8308b7e7d46ca1
36.978928
other
158
72
true
false
false
true
37.747001
0.634378
63.43779
0.629636
47.696174
0.280211
28.021148
0.369966
15.995526
0.452073
17.042448
0.547124
49.680482
false
true
2024-05-27
2024-10-20
1
Qwen/Qwen2-72B
cognitivecomputations_dolphin-2.9.2-qwen2-7b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.2-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.2-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.2-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.2-qwen2-7b
c443c4eb5138ed746ac49ed98bf3c183dc5380ac
21.272082
apache-2.0
67
7.616
true
false
false
true
2.558395
0.35346
35.345993
0.489383
27.914875
0.134441
13.444109
0.290268
5.369128
0.419146
11.659896
0.405086
33.898493
false
true
2024-05-24
2024-07-10
1
Qwen/Qwen2-7B
cognitivecomputations_dolphin-2.9.3-Yi-1.5-34B-32k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.3-Yi-1.5-34B-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.3-Yi-1.5-34B-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.3-Yi-1.5-34B-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.3-Yi-1.5-34B-32k
ff4eee6438194a670a95dff3118b5231eb568610
27.098383
apache-2.0
18
34
true
false
false
true
6.490521
0.363927
36.39266
0.6047
43.406476
0.166918
16.691843
0.343121
12.416107
0.431052
13.348177
0.463015
40.335033
false
true
2024-06-23
2024-07-27
1
01-ai/Yi-1.5-34B-32k
cognitivecomputations_dolphin-2.9.3-mistral-7B-32k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.3-mistral-7B-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.3-mistral-7B-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.3-mistral-7B-32k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.3-mistral-7B-32k
4f4273ee8e7930dd64e2c6121c79d12546b883e2
19.348696
apache-2.0
52
7.248
true
false
false
true
1.200165
0.412636
41.263625
0.481254
26.906354
0.050604
5.060423
0.285235
4.697987
0.46426
17.932552
0.282081
20.231235
false
true
2024-06-25
2024-07-04
1
mistralai/Mistral-7B-v0.3
cognitivecomputations_dolphin-2.9.3-mistral-nemo-12b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.3-mistral-nemo-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.3-mistral-nemo-12b
7b535c900688fc836fbeebaeb7133910b09bafda
24.972431
apache-2.0
100
12.248
true
false
false
true
2.750285
0.560089
56.008945
0.548037
36.082759
0.074018
7.401813
0.315436
8.724832
0.44299
15.207031
0.337683
26.409205
false
true
2024-07-23
2024-07-26
1
mistralai/Mistral-Nemo-Base-2407
cognitivecomputations_dolphin-2.9.4-gemma2-2b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.4-gemma2-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.4-gemma2-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.4-gemma2-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.4-gemma2-2b
5c0854beb88a6711221771d1b13d51f733e6ca06
9.835205
gemma
36
2.614
true
false
false
true
3.022496
0.089551
8.955128
0.408132
17.367633
0.049094
4.909366
0.284396
4.58613
0.417969
10.91276
0.210522
12.280216
false
true
2024-08-24
2024-08-25
1
google/gemma-2-2b
cognitivecomputations_dolphin-2.9.4-llama3.1-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cognitivecomputations/dolphin-2.9.4-llama3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cognitivecomputations/dolphin-2.9.4-llama3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cognitivecomputations__dolphin-2.9.4-llama3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cognitivecomputations/dolphin-2.9.4-llama3.1-8b
7b73d1b7760bf9abac168de3d388b30d1ca1a138
7.131861
llama3.1
96
8.03
true
false
false
true
2.637556
0.275724
27.572397
0.352363
8.972089
0.012085
1.208459
0.263423
1.789709
0.323615
0.61849
0.12367
2.630024
false
true
2024-08-04
2024-09-17
1
meta-llama/Meta-Llama-3.1-8B
collaiborateorg_Collaiborator-MEDLLM-Llama-3-8B-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/collaiborateorg__Collaiborator-MEDLLM-Llama-3-8B-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
collaiborateorg/Collaiborator-MEDLLM-Llama-3-8B-v2
2560556d655d0ecaefec10f579c92292d65fb28b
17.939047
0
8.03
false
false
false
false
1.411578
0.380887
38.088716
0.464803
23.648503
0.056647
5.664653
0.333054
11.073826
0.343427
1.595052
0.348072
27.563534
false
false
2024-06-27
0
Removed
cpayne1303_cp2024_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cpayne1303/cp2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/cp2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__cp2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cpayne1303/cp2024
fb354aaa73c40b4f1fc6e86beea733e4f3929470
3.702133
apache-2.0
0
0.031
true
false
false
false
0.095226
0.165814
16.581448
0.298539
2.739141
0.005287
0.528701
0.255872
0.782998
0.338313
0.455729
0.110123
1.124778
false
false
2024-11-26
2024-11-26
0
cpayne1303/cp2024
cpayne1303_cp2024-instruct_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cpayne1303/cp2024-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/cp2024-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__cp2024-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cpayne1303/cp2024-instruct
ac4cfbc28479f8a94e3eb745526620be9b75edfa
4.319731
apache-2.0
1
0.031
true
false
false
true
0.064324
0.170611
17.061065
0.294678
2.4813
0
0
0.260067
1.342282
0.368635
3.179427
0.116689
1.854314
false
false
2024-11-27
2024-11-27
1
cpayne1303/cp2024
cpayne1303_llama-43m-beta_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cpayne1303/llama-43m-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/llama-43m-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__llama-43m-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cpayne1303/llama-43m-beta
1f85bec8c3541ed58fc2fcf4e6f98c1c34d72f60
5.288332
apache-2.0
0
0.043
true
false
false
false
0.058392
0.191568
19.156837
0.297678
2.482041
0
0
0.268456
2.46085
0.387177
6.163802
0.113198
1.46646
false
false
2024-11-30
2024-11-30
1
JackFram/llama-68m
cpayne1303_llama-43m-beta_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cpayne1303/llama-43m-beta" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/llama-43m-beta</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__llama-43m-beta-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cpayne1303/llama-43m-beta
1f85bec8c3541ed58fc2fcf4e6f98c1c34d72f60
5.422629
apache-2.0
0
0.043
true
false
false
false
0.119832
0.194891
19.489067
0.296463
2.496048
0.004532
0.453172
0.268456
2.46085
0.388542
6.401042
0.11112
1.235594
false
false
2024-11-30
2024-12-04
1
JackFram/llama-68m
cpayne1303_smallcp2024_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cpayne1303/smallcp2024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cpayne1303/smallcp2024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cpayne1303__smallcp2024-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cpayne1303/smallcp2024
ef995127242553e4126190e7f70f927504834360
3.543848
apache-2.0
0
0.002
true
false
false
false
0.094616
0.158196
15.819581
0.302705
3.118178
0.005287
0.528701
0.230705
0
0.342469
0.533333
0.11137
1.263298
false
false
2024-11-27
2024-11-27
0
cpayne1303/smallcp2024
crestf411_MN-Slush_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/crestf411/MN-Slush" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">crestf411/MN-Slush</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/crestf411__MN-Slush-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
crestf411/MN-Slush
46a0cd7e9355f232bdfe9d21a55b944319e23206
22.136983
22
12.248
false
false
false
false
2.124218
0.407715
40.771486
0.534001
33.156422
0.126888
12.688822
0.323826
9.8434
0.393281
8.49349
0.350814
27.868277
false
false
2024-11-20
2025-01-04
1
crestf411/MN-Slush (Merge)
cstr_llama3.1-8b-spaetzle-v90_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/cstr/llama3.1-8b-spaetzle-v90" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cstr/llama3.1-8b-spaetzle-v90</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/cstr__llama3.1-8b-spaetzle-v90-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cstr/llama3.1-8b-spaetzle-v90
717e5c3d31ed2465cd7cf927327adf677a9420b5
27.855367
llama3
2
8.03
true
false
false
true
1.557816
0.735619
73.561927
0.530286
32.763666
0.149547
14.954683
0.282718
4.362416
0.413437
11.146354
0.373088
30.343159
true
false
2024-09-15
2024-09-15
1
cstr/llama3.1-8b-spaetzle-v90 (Merge)