Column schema (36 columns; dtypes and value ranges as reported by the dataset viewer):

Column                  Dtype     Range / distinct values
eval_name               string    length 12 to 111
Precision               string    3 classes
Type                    string    7 classes
T                       string    7 classes
Weight type             string    2 classes
Architecture            string    64 classes
Model                   string    length 355 to 689
fullname                string    length 4 to 102
Model sha               string    length 0 to 40
Average ⬆️              float64   0.74 to 52.1
Hub License             string    27 classes
Hub ❤️                  int64     0 to 6.09k
#Params (B)             float64   -1 to 141
Available on the hub    bool      2 classes
MoE                     bool      2 classes
Flagged                 bool      2 classes
Chat Template           bool      2 classes
CO₂ cost (kg)           float64   0.04 to 187
IFEval Raw              float64   0 to 0.9
IFEval                  float64   0 to 90
BBH Raw                 float64   0.22 to 0.83
BBH                     float64   0.25 to 76.7
MATH Lvl 5 Raw          float64   0 to 0.71
MATH Lvl 5              float64   0 to 71.5
GPQA Raw                float64   0.21 to 0.47
GPQA                    float64   0 to 29.4
MUSR Raw                float64   0.29 to 0.6
MUSR                    float64   0 to 38.7
MMLU-PRO Raw            float64   0.1 to 0.73
MMLU-PRO                float64   0 to 70
Merged                  bool      2 classes
Official Providers      bool      2 classes
Upload To Hub Date      string    525 classes
Submission Date         string    263 classes
Generation              int64     0 to 10
Base Model              string    length 4 to 102
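The records below are flattened one field per line, in the column order of the schema above. A minimal sketch of how such a record could be mapped back into a typed dict (the `SCHEMA` table and `parse_record` helper are illustrative, not part of the dataset; the line-per-field layout is an assumption about this dump):

```python
# Illustrative parser for one flattened record. Column names and order follow
# the schema listing; the parsing logic itself is an assumption about the dump.
SCHEMA = [
    ("eval_name", str), ("Precision", str), ("Type", str), ("T", str),
    ("Weight type", str), ("Architecture", str), ("Model", str),
    ("fullname", str), ("Model sha", str), ("Average ⬆️", float),
    ("Hub License", str), ("Hub ❤️", int), ("#Params (B)", float),
    ("Available on the hub", bool), ("MoE", bool), ("Flagged", bool),
    ("Chat Template", bool), ("CO₂ cost (kg)", float),
    ("IFEval Raw", float), ("IFEval", float),
    ("BBH Raw", float), ("BBH", float),
    ("MATH Lvl 5 Raw", float), ("MATH Lvl 5", float),
    ("GPQA Raw", float), ("GPQA", float),
    ("MUSR Raw", float), ("MUSR", float),
    ("MMLU-PRO Raw", float), ("MMLU-PRO", float),
    ("Merged", bool), ("Official Providers", bool),
    ("Upload To Hub Date", str), ("Submission Date", str),
    ("Generation", int), ("Base Model", str),
]

def parse_record(lines):
    """Convert one flattened record (one field value per line) into a dict,
    coercing each value to the dtype declared in SCHEMA."""
    record = {}
    for (name, typ), raw in zip(SCHEMA, lines):
        if typ is bool:
            record[name] = raw == "true"   # booleans are dumped as true/false
        elif typ in (int, float):
            record[name] = typ(raw)
        else:
            record[name] = raw
    return record
```

For example, feeding the first eighteen lines of the first record below yields `rec["Average ⬆️"] == 19.06061` and `rec["Available on the hub"] is True`.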
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01
4c30fdbe0708afefe50788ea640c3dfab294c77f
19.06061
apache-2.0
0
8.03
true
false
false
false
1.801262
0.291311
29.13115
0.49183
28.219373
0.010574
1.057402
0.300336
6.711409
0.497677
21.976302
0.345412
27.268026
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.01 (Merge)
johnsutor_Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1
378a7cad3e34a1a8b11e77edd95b02ff0d228da2
21.361497
apache-2.0
0
8.03
true
false
false
false
1.859824
0.416233
41.623337
0.513861
30.841602
0.077795
7.779456
0.29698
6.263982
0.431729
12.499479
0.36245
29.161126
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_breadcrumbs_ties-density-0.9-gamma-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_linear_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_linear
abb81fd8fdc2ad32f65befcb7ae369c9837cd563
14.123523
apache-2.0
0
8.03
true
false
false
false
1.820512
0.21455
21.454962
0.428281
19.610999
0
0
0.296141
6.152125
0.497927
21.807552
0.241439
15.715499
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_linear (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1
e7a3a3b955d945f53da8301b958f0b90a28a62d3
11.632495
apache-2.0
0
8.03
true
false
false
false
1.822268
0.189071
18.907056
0.411874
16.858917
0.000755
0.075529
0.271812
2.908277
0.465802
16.991927
0.226479
14.053265
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3
6f966d14d7236f3da6d1ea9ce3bd9b20808e02a9
15.968947
apache-2.0
0
8.03
true
false
false
false
1.846769
0.211327
21.132706
0.455857
23.094936
0.001511
0.151057
0.29698
6.263982
0.506948
22.501823
0.304023
22.669178
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.3 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.7_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7
b14b5cd07feb749e42b0567b1e387b390bed033e
16.77203
apache-2.0
0
8.03
true
false
false
false
2.075424
0.203384
20.338369
0.472286
25.253546
0.003021
0.302115
0.303691
7.158837
0.51101
23.709635
0.314827
23.869681
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.7 (Merge)
johnsutor_Llama-3-8B-Instruct_dare_ties-density-0.9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_dare_ties-density-0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9
17.30977
apache-2.0
0
8.03
true
false
false
false
2.659321
0.216073
21.607335
0.466396
24.687623
0.001511
0.151057
0.307886
7.718121
0.523042
25.880208
0.314328
23.814273
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_dare_ties-density-0.9 (Merge)
johnsutor_Llama-3-8B-Instruct_linear_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_linear" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_linear</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_linear-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_linear
7449157fbc2e8b02e5b6e8ad56b4b2bd7ea82e9d
21.370873
apache-2.0
0
8.03
true
false
false
false
1.652212
0.430821
43.082133
0.50315
28.778577
0.100453
10.045317
0.295302
6.040268
0.409719
10.148177
0.371177
30.130762
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_linear (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.1
84793f89ebe3be5b5bd9a797d4bbdf374c07419d
20.428512
apache-2.0
0
8.03
true
false
false
false
1.572722
0.411612
41.16123
0.502145
28.768719
0.079305
7.930514
0.288591
5.145414
0.417375
10.671875
0.36004
28.893322
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.1 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.3
8d051f3eec3fc93a4521073c2d290c4ff9144fc1
18.85497
apache-2.0
0
8.03
true
false
false
false
1.914671
0.362628
36.262783
0.490611
27.724507
0.067221
6.722054
0.296141
6.152125
0.40249
10.477865
0.332114
25.790485
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.3 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.5
c857e33c30016960f114e3a049f5dae41d68bfe7
18.221596
apache-2.0
0
8.03
true
false
false
false
1.683866
0.379664
37.966374
0.479312
26.012097
0.061178
6.117825
0.30453
7.270694
0.387979
7.797396
0.317487
24.165189
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.5 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.7_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.7
8d7d8bbb1e8cba5e51337f97bc3d6d8ae40544d5
18.056543
apache-2.0
0
8.03
true
false
false
false
1.796102
0.368123
36.812325
0.473819
25.371408
0.067221
6.722054
0.309564
7.941834
0.388073
7.575781
0.315243
23.915854
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.7 (Merge)
johnsutor_Llama-3-8B-Instruct_ties-density-0.9_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/johnsutor/Llama-3-8B-Instruct_ties-density-0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">johnsutor/Llama-3-8B-Instruct_ties-density-0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/johnsutor__Llama-3-8B-Instruct_ties-density-0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
johnsutor/Llama-3-8B-Instruct_ties-density-0.9
57c280ce43fe81a23c966b48de6db7f4a85383a3
18.135851
apache-2.0
0
8.03
true
false
false
false
1.802116
0.385809
38.580854
0.473543
25.463735
0.061934
6.193353
0.299497
6.599553
0.388042
7.738542
0.318152
24.239066
true
false
2024-06-07
2024-06-26
1
johnsutor/Llama-3-8B-Instruct_ties-density-0.9 (Merge)
jpacifico_Chocolatine-14B-Instruct-4k-DPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-14B-Instruct-4k-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-14B-Instruct-4k-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-14B-Instruct-4k-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-14B-Instruct-4k-DPO
30677e58010979af26b70240846fdf7ff38cbbf2
30.316421
mit
1
13.96
true
false
false
false
9.896402
0.468865
46.886483
0.629958
48.020722
0.178248
17.824773
0.341443
12.192394
0.443885
15.152344
0.476396
41.821809
false
false
2024-08-01
2024-08-08
0
jpacifico/Chocolatine-14B-Instruct-4k-DPO
jpacifico_Chocolatine-14B-Instruct-DPO-v1.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-14B-Instruct-DPO-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-14B-Instruct-DPO-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-14B-Instruct-DPO-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-14B-Instruct-DPO-v1.2
d34bbd55b48e553f28579d86f3ccae19726c6b39
33.795811
mit
14
13.96
true
false
false
true
3.081207
0.685211
68.52108
0.643841
49.845064
0.209215
20.92145
0.325503
10.067114
0.426771
12.346354
0.469664
41.073803
false
false
2024-08-12
2024-08-28
0
jpacifico/Chocolatine-14B-Instruct-DPO-v1.2
jpacifico_Chocolatine-14B-Instruct-DPO-v1.3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-14B-Instruct-DPO-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-14B-Instruct-DPO-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-14B-Instruct-DPO-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-14B-Instruct-DPO-v1.3
145732abae63ecdcae9770d47b5c29dd67550837
42.420491
mit
0
14.66
true
false
false
true
1.696518
0.703995
70.39954
0.684613
54.846486
0.561934
56.193353
0.341443
12.192394
0.423396
12.291146
0.5374
48.60003
false
false
2024-12-22
2025-01-20
1
jpacifico/Chocolatine-14B-Instruct-DPO-v1.3 (Merge)
jpacifico_Chocolatine-2-14B-Instruct-DPO-v2.0b1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-2-14B-Instruct-DPO-v2.0b1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-2-14B-Instruct-DPO-v2.0b1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-2-14B-Instruct-DPO-v2.0b1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-2-14B-Instruct-DPO-v2.0b1
023a26b311482d3d849684b2f0e235779d0a9d67
27.97148
0
14.66
false
false
false
false
1.677998
0.10334
10.334025
0.669567
52.018836
0.27568
27.567976
0.375839
16.778523
0.44674
15.309115
0.512384
45.820405
false
false
2025-01-22
0
Removed
jpacifico_Chocolatine-2-14B-Instruct-v2.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-2-14B-Instruct-v2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-2-14B-Instruct-v2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-2-14B-Instruct-v2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-2-14B-Instruct-v2.0
46123d125c3f2a7f05e96bd6f25d4acd9e6c1bb8
33.391325
mit
1
14.66
true
false
false
false
1.861754
0.088527
8.852733
0.676993
53.420173
0.480363
48.036254
0.387584
18.344519
0.502115
23.897656
0.53017
47.796616
false
false
2025-02-05
2025-02-05
0
jpacifico/Chocolatine-2-14B-Instruct-v2.0
jpacifico_Chocolatine-2-14B-Instruct-v2.0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-2-14B-Instruct-v2.0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-2-14B-Instruct-v2.0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-2-14B-Instruct-v2.0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-2-14B-Instruct-v2.0.1
2d682f3fcc93c15dcca03ea11e8663e39eac4c53
33.080573
0
14.66
false
false
false
false
1.80694
0.074214
7.42142
0.673628
52.901491
0.479607
47.960725
0.391779
18.903803
0.50075
23.527083
0.52992
47.768913
false
false
2025-02-03
0
Removed
jpacifico_Chocolatine-2-14B-Instruct-v2.0.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-2-14B-Instruct-v2.0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-2-14B-Instruct-v2.0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-2-14B-Instruct-v2.0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-2-14B-Instruct-v2.0.3
6b576e8845f27d3472e522eca31f962bf16648b6
41.32785
apache-2.0
11
14.766
true
false
false
false
3.812492
0.703721
70.372057
0.654803
50.631345
0.420695
42.069486
0.379195
17.225951
0.476813
19.068229
0.5374
48.60003
false
false
2025-02-06
2025-02-06
0
jpacifico/Chocolatine-2-14B-Instruct-v2.0.3
jpacifico_Chocolatine-2-14B-Instruct-v2.0b2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-2-14B-Instruct-v2.0b2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-2-14B-Instruct-v2.0b2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-2-14B-Instruct-v2.0b2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-2-14B-Instruct-v2.0b2
961454f7242d9a35b85c40bf7ca37821edb8edc2
41.246359
mit
6
14.766
true
false
false
false
3.728794
0.724079
72.407878
0.647582
49.578491
0.395015
39.501511
0.383389
17.785235
0.48075
19.660417
0.536902
48.544622
false
false
2025-01-23
2025-01-24
0
jpacifico/Chocolatine-2-14B-Instruct-v2.0b2
jpacifico_Chocolatine-2-14B-Instruct-v2.0b3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-2-14B-Instruct-v2.0b3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-2-14B-Instruct-v2.0b3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-2-14B-Instruct-v2.0b3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-2-14B-Instruct-v2.0b3
73116c04c78e401dd6291f5755b2cdf31aca7068
41.433522
apache-2.0
2
14.766
true
false
false
false
3.850072
0.732297
73.229697
0.646879
49.566511
0.410876
41.087613
0.379195
17.225951
0.478115
19.297656
0.533743
48.193706
false
false
2025-01-25
2025-01-27
0
jpacifico/Chocolatine-2-14B-Instruct-v2.0b3
jpacifico_Chocolatine-3B-Instruct-DPO-Revised_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-3B-Instruct-DPO-Revised" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-3B-Instruct-DPO-Revised</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-3B-Instruct-DPO-Revised-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-3B-Instruct-DPO-Revised
c403df6c0f78148cfb477972455cbd859149311a
28.226631
mit
28
3.821
true
false
false
true
1.509449
0.562263
56.226257
0.553998
37.155286
0.180514
18.05136
0.322148
9.619687
0.445344
15.101302
0.398853
33.205895
false
false
2024-07-17
2024-07-19
0
jpacifico/Chocolatine-3B-Instruct-DPO-Revised
jpacifico_Chocolatine-3B-Instruct-DPO-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-3B-Instruct-DPO-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-3B-Instruct-DPO-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-3B-Instruct-DPO-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-3B-Instruct-DPO-v1.0
98d049b8f8c305cfba81adae498a95e6b5647d4a
25.429591
apache-2.0
3
3.821
true
false
false
false
1.598446
0.373718
37.37184
0.54714
36.55452
0.178248
17.824773
0.315436
8.724832
0.475479
19.468229
0.3937
32.633348
false
false
2024-07-11
2024-07-11
0
jpacifico/Chocolatine-3B-Instruct-DPO-v1.0
jpacifico_Chocolatine-3B-Instruct-DPO-v1.2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Chocolatine-3B-Instruct-DPO-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Chocolatine-3B-Instruct-DPO-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Chocolatine-3B-Instruct-DPO-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Chocolatine-3B-Instruct-DPO-v1.2
ebc9de6c266586adb1ec0db31bf050d1cd8fdffe
27.861913
mit
9
3.821
true
false
false
true
1.948889
0.545501
54.550149
0.548718
35.999388
0.204683
20.468278
0.338926
11.856823
0.415427
12.328385
0.387716
31.968454
false
false
2024-08-22
2024-08-28
0
jpacifico/Chocolatine-3B-Instruct-DPO-v1.2
jpacifico_Distilucie-7B-Math-Instruct-DPO-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Distilucie-7B-Math-Instruct-DPO-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Distilucie-7B-Math-Instruct-DPO-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Distilucie-7B-Math-Instruct-DPO-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Distilucie-7B-Math-Instruct-DPO-v0.1
b1ea75c097bf8c64d6ceb0c03140ca346e13692e
11.128778
apache-2.0
0
6.707
true
false
false
false
0.557326
0.30475
30.475028
0.38347
14.914767
0.02568
2.567976
0.299497
6.599553
0.364448
3.222656
0.180934
8.992686
false
false
2025-03-02
2025-03-02
0
jpacifico/Distilucie-7B-Math-Instruct-DPO-v0.1
jpacifico_Lucie-7B-Instruct-DPO-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Lucie-7B-Instruct-DPO-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Lucie-7B-Instruct-DPO-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Lucie-7B-Instruct-DPO-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Lucie-7B-Instruct-DPO-v1.1
eea677f0abd298574708d41351b3aeb5cd348756
11.70483
apache-2.0
0
6.707
true
false
false
false
0.483713
0.312094
31.209413
0.378101
14.205403
0.023414
2.34139
0.287752
5.033557
0.401594
8.132552
0.18376
9.306664
false
false
2025-02-17
2025-02-25
0
jpacifico/Lucie-7B-Instruct-DPO-v1.1
jpacifico_Lucie-7B-Instruct-DPO-v1.1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Lucie-7B-Instruct-DPO-v1.1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Lucie-7B-Instruct-DPO-v1.1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Lucie-7B-Instruct-DPO-v1.1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Lucie-7B-Instruct-DPO-v1.1.3
32ba7b0321e1050cd473fe4ef598075f6c77532a
10.946784
0
6.707
false
false
false
false
0.478903
0.304475
30.447546
0.3819
14.565627
0.024169
2.416918
0.286074
4.809843
0.381781
4.95599
0.176363
8.484781
false
false
2025-03-04
0
Removed
jpacifico_Lucie-7B-Instruct-Merged-Model_Stock-v1.0_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Lucie-7B-Instruct-Merged-Model_Stock-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Lucie-7B-Instruct-Merged-Model_Stock-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Lucie-7B-Instruct-Merged-Model_Stock-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Lucie-7B-Instruct-Merged-Model_Stock-v1.0
24219e509c7dcc3afd5012951fb4b190a36c9cba
11.635401
0
6.707
false
false
false
false
0.492422
0.32336
32.33598
0.380202
14.756535
0.024169
2.416918
0.288591
5.145414
0.384385
5.48151
0.187084
9.676049
false
false
2025-02-26
2025-02-26
1
jpacifico/Lucie-7B-Instruct-Merged-Model_Stock-v1.0 (Merge)
jpacifico_Lucie-7B-Instruct-Merged-Model_Stock-v1.1_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Lucie-7B-Instruct-Merged-Model_Stock-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Lucie-7B-Instruct-Merged-Model_Stock-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Lucie-7B-Instruct-Merged-Model_Stock-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Lucie-7B-Instruct-Merged-Model_Stock-v1.1
b1e18403e60b62f980715bed673aa34ac8a59c7c
10.899833
0
6.707
false
false
false
false
0.582283
0.301428
30.142799
0.380786
14.68048
0.027946
2.794562
0.282718
4.362416
0.375021
3.844271
0.18617
9.574468
false
false
2025-03-02
0
Removed
jpacifico_Lucie-Boosted-7B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jpacifico/Lucie-Boosted-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jpacifico/Lucie-Boosted-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jpacifico__Lucie-Boosted-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jpacifico/Lucie-Boosted-7B-Instruct
520a92d2cf75cd6ddfa6dcbc93c31dbf7f23939f
8.306617
apache-2.0
0
6.707
true
false
false
false
0.990112
0.256615
25.661467
0.346548
10.258061
0.01284
1.283988
0.266779
2.237136
0.369875
3.401042
0.162982
6.998005
false
false
2025-01-16
2025-01-27
0
jpacifico/Lucie-Boosted-7B-Instruct
jsfs11_L3-8B-Stheno-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/jsfs11/L3-8B-Stheno-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jsfs11/L3-8B-Stheno-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jsfs11__L3-8B-Stheno-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jsfs11/L3-8B-Stheno-slerp
b74450cac91180fcd92d72d60377e2d0a0b1bd11
24.999084
1
8.03
false
false
false
true
1.070459
0.675194
67.519404
0.532568
33.310316
0.098943
9.89426
0.285235
4.697987
0.372542
5.134375
0.364943
29.438165
false
false
2024-06-18
2025-01-07
1
jsfs11/L3-8B-Stheno-slerp (Merge)
jsfs11_MixtureofMerges-MoE-4x7b-v4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jsfs11/MixtureofMerges-MoE-4x7b-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jsfs11__MixtureofMerges-MoE-4x7b-v4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jsfs11/MixtureofMerges-MoE-4x7b-v4
2b98406f20a874184dbffb5ed24e1f4b5063ec4b
20.022361
apache-2.0
4
24.154
true
true
false
false
2.767656
0.402994
40.299406
0.516901
32.217998
0.063444
6.344411
0.286074
4.809843
0.438552
13.885677
0.303191
22.576832
true
false
2024-02-11
2024-08-05
1
jsfs11/MixtureofMerges-MoE-4x7b-v4 (Merge)
jsfs11_MixtureofMerges-MoE-4x7b-v5_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/jsfs11/MixtureofMerges-MoE-4x7b-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">jsfs11/MixtureofMerges-MoE-4x7b-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/jsfs11__MixtureofMerges-MoE-4x7b-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
jsfs11/MixtureofMerges-MoE-4x7b-v5
c1b5ce7144b966062df7627d2482a59e0df3757c
20.434941
apache-2.0
1
24.154
true
true
false
false
2.862545
0.41993
41.993023
0.519848
32.826724
0.075529
7.55287
0.284396
4.58613
0.43049
12.344531
0.309757
23.306368
true
false
2024-02-25
2024-08-05
1
jsfs11/MixtureofMerges-MoE-4x7b-v5 (Merge)
kaist-ai_janus-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kaist-ai/janus-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/janus-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__janus-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/janus-7b
f19c614ae7c81db06af1655d297c67afa99ad286
17.616999
apache-2.0
8
7.242
true
false
false
false
1.213207
0.377515
37.751499
0.469367
25.74987
0.040785
4.07855
0.272651
3.020134
0.440104
14.279688
0.2874
20.822252
false
false
2024-04-04
2024-10-09
1
alpindale/Mistral-7B-v0.2-hf
kaist-ai_janus-dpo-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kaist-ai/janus-dpo-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/janus-dpo-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__janus-dpo-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/janus-dpo-7b
a414396b6d03fba75d12ccf7d8391186b4b639ce
18.531649
apache-2.0
3
7.242
true
false
false
false
1.252857
0.400271
40.027128
0.477258
27.090902
0.041541
4.154079
0.281879
4.250559
0.43874
13.709115
0.297623
21.958112
false
false
2024-04-25
2024-10-09
1
Removed
kaist-ai_janus-rm-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LLMForSequenceRegression
<a target="_blank" href="https://huggingface.co/kaist-ai/janus-rm-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/janus-rm-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__janus-rm-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/janus-rm-7b
ffdbcc353ad4034fdfa68a767d265920d5f3e71c
4.775599
apache-2.0
4
7.111
true
false
false
false
1.078221
0.177805
17.780489
0.305647
3.277781
0
0
0.251678
0.223714
0.388292
5.969792
0.112616
1.401817
false
false
2024-05-09
2024-10-09
0
kaist-ai/janus-rm-7b
kaist-ai_mistral-orpo-capybara-7k_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kaist-ai/mistral-orpo-capybara-7k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kaist-ai/mistral-orpo-capybara-7k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kaist-ai__mistral-orpo-capybara-7k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kaist-ai/mistral-orpo-capybara-7k
24c1172060658a1923c9b454796857e2cc59fbeb
19.220895
mit
26
7.242
true
false
false
true
1.321487
0.536734
53.673364
0.4489
23.434359
0.039275
3.927492
0.286074
4.809843
0.396354
7.577604
0.297124
21.902704
false
false
2024-03-23
2024-10-09
1
kaist-ai/mistral-orpo-capybara-7k (Merge)
kavonalds_BunderMaxx-0710_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/kavonalds/BunderMaxx-0710" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kavonalds/BunderMaxx-0710</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kavonalds__BunderMaxx-0710-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kavonalds/BunderMaxx-0710
926bc1c4eff036fda0a56e4650366bbd35ae64ec
16.371169
0
1.236
false
false
false
true
0.34688
0.328256
32.825569
0.665076
51.577548
0.067976
6.797583
0.260906
1.454139
0.339333
2.083333
0.1314
3.488845
false
false
2025-02-26
0
Removed
kavonalds_BunderMaxx-0710_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/kavonalds/BunderMaxx-0710" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kavonalds/BunderMaxx-0710</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kavonalds__BunderMaxx-0710-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kavonalds/BunderMaxx-0710
926bc1c4eff036fda0a56e4650366bbd35ae64ec
13.9546
0
1.236
false
false
false
false
0.364822
0.270079
27.007895
0.556586
36.10978
0.067976
6.797583
0.280201
4.026846
0.368198
4.791406
0.144947
4.99409
false
false
2025-02-26
0
Removed
kavonalds_BunderMaxx-1010_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/kavonalds/BunderMaxx-1010" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kavonalds/BunderMaxx-1010</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kavonalds__BunderMaxx-1010-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kavonalds/BunderMaxx-1010
84903278a74a863ae317bb3f4678a1440c67187a
17.426948
0
1.236
false
false
false
true
0.350289
0.298056
29.805583
0.701984
56.822652
0.104985
10.498489
0.260906
1.454139
0.348448
3.489323
0.122424
2.491504
false
false
2025-02-26
0
Removed
kavonalds_Lancer-1-1b-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/kavonalds/Lancer-1-1b-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kavonalds/Lancer-1-1b-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kavonalds__Lancer-1-1b-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kavonalds/Lancer-1-1b-Instruct
1fcd28f30158c8990c62a13f7750de1a98f229e2
12.30611
0
1.236
false
false
false
true
0.366585
0.554594
55.459403
0.325327
6.035794
0.039275
3.927492
0.261745
1.565996
0.314438
0.533333
0.156832
6.314642
false
false
2025-02-16
0
Removed
kayfour_T3Q-Qwen2.5-7B-it-KOR-Safe_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/kayfour/T3Q-Qwen2.5-7B-it-KOR-Safe" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kayfour/T3Q-Qwen2.5-7B-it-KOR-Safe</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kayfour__T3Q-Qwen2.5-7B-it-KOR-Safe-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kayfour/T3Q-Qwen2.5-7B-it-KOR-Safe
7a737c91827de2c67a632090730a03dae0921e3a
32.418099
apache-2.0
1
7.616
true
false
false
false
0.667271
0.60815
60.814971
0.554994
36.45158
0.376133
37.613293
0.321309
9.50783
0.427729
11.632813
0.446393
38.488106
false
false
2024-11-11
2025-02-20
0
kayfour/T3Q-Qwen2.5-7B-it-KOR-Safe
keeeeenw_MicroLlama_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/keeeeenw/MicroLlama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">keeeeenw/MicroLlama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/keeeeenw__MicroLlama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
keeeeenw/MicroLlama
8d5874ca07b86ea1ea2e71eea96212278506ba65
5.266088
apache-2.0
45
0.305
true
false
false
false
0.371536
0.198538
19.853766
0.300731
2.831364
0.011329
1.132931
0.260906
1.454139
0.369812
4.793229
0.11378
1.531102
false
false
2024-03-29
2024-09-15
0
keeeeenw/MicroLlama
kekmodel_StopCarbon-10.7B-v5_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/kekmodel/StopCarbon-10.7B-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kekmodel/StopCarbon-10.7B-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kekmodel__StopCarbon-10.7B-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kekmodel/StopCarbon-10.7B-v5
7d59819dce2439f6c83b4f5c21a68aa882ff5ac9
20.932992
cc-by-nc-sa-4.0
2
10.732
true
false
false
true
1.490587
0.472837
47.283652
0.517772
31.993222
0.055891
5.589124
0.306208
7.494407
0.401938
9.275521
0.315658
23.962027
true
false
2023-12-30
2024-07-25
0
kekmodel/StopCarbon-10.7B-v5
kevin009_llamaRAGdrama_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kevin009/llamaRAGdrama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kevin009/llamaRAGdrama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kevin009__llamaRAGdrama-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kevin009/llamaRAGdrama
8c103ca8fa6dd9a8d3dab81b319408095e9a1ad8
13.348717
apache-2.0
7
7.242
true
false
false
true
1.279278
0.259837
25.983723
0.400739
16.637814
0.043051
4.305136
0.264262
1.901566
0.431573
12.113281
0.272357
19.150783
false
false
2024-02-04
2024-06-26
0
kevin009/llamaRAGdrama
khoantap_cheap-moe-merge_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/cheap-moe-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/cheap-moe-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__cheap-moe-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/cheap-moe-merge
1bb9f1fe81fafd43cf8dbbeae1eae43da665d3f4
21.655015
0
19.305
false
true
false
false
4.786928
0.455701
45.570087
0.513117
29.799724
0.092145
9.214502
0.295302
6.040268
0.410302
13.321094
0.33386
25.984412
false
false
2025-01-20
2025-01-21
0
khoantap/cheap-moe-merge
khoantap_llama-3-8b-stock-merge_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/llama-3-8b-stock-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/llama-3-8b-stock-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__llama-3-8b-stock-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/llama-3-8b-stock-merge
0ba787be8f982dab1415aaeaf66afb58579574b8
23.933563
0
8.03
false
false
false
false
1.483799
0.48118
48.117994
0.516226
30.760624
0.161631
16.163142
0.317953
9.060403
0.394583
8.389583
0.379987
31.109634
false
false
2025-01-17
2025-01-21
0
khoantap/llama-3-8b-stock-merge
khoantap_llama-breadcrumbs-ties-merge_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/llama-breadcrumbs-ties-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/llama-breadcrumbs-ties-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__llama-breadcrumbs-ties-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/llama-breadcrumbs-ties-merge
57e28eb2dc698f1d6f807145efe61f4be0e6bfe4
17.90628
0
8.03
false
false
false
false
1.49757
0.220519
22.051933
0.541593
33.780783
0.112538
11.253776
0.26594
2.12528
0.443448
14.097656
0.317154
24.128251
false
false
2025-01-17
2025-01-21
0
khoantap/llama-breadcrumbs-ties-merge
khoantap_llama-evolve-ties-best-merge_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/llama-evolve-ties-best-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/llama-evolve-ties-best-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__llama-evolve-ties-best-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/llama-evolve-ties-best-merge
8616af5c46c02605a57e46dcfe5d960f767541d9
27.644114
0
8.03
false
false
false
false
1.461923
0.674395
67.439505
0.541357
34.848653
0.156344
15.634441
0.317114
8.948546
0.394552
7.21901
0.385971
31.774527
false
false
2025-01-20
2025-01-21
0
khoantap/llama-evolve-ties-best-merge
khoantap_llama-linear-0.5-0.5-1-merge_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/llama-linear-0.5-0.5-1-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/llama-linear-0.5-0.5-1-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__llama-linear-0.5-0.5-1-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/llama-linear-0.5-0.5-1-merge
a131718ecb2fa7da4a873fc2cfa4bae180bf9c7a
25.906963
0
8.03
false
false
false
false
1.451876
0.48123
48.12298
0.564301
38.205852
0.205438
20.543807
0.307047
7.606264
0.414271
9.483854
0.383311
31.479019
false
false
2025-01-17
2025-01-21
0
khoantap/llama-linear-0.5-0.5-1-merge
khoantap_llama-linear-0.5-1-0.5-merge_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/llama-linear-0.5-1-0.5-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/llama-linear-0.5-1-0.5-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__llama-linear-0.5-1-0.5-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/llama-linear-0.5-1-0.5-merge
ac380e0915db101279ea64defeff6ef45fb653e5
25.550041
0
8.03
false
false
false
false
1.397081
0.503162
50.316161
0.595077
42.29147
0.148036
14.803625
0.293624
5.816555
0.417188
10.181771
0.369016
29.890662
false
false
2025-01-17
2025-01-21
0
khoantap/llama-linear-0.5-1-0.5-merge
khoantap_llama-linear-1-0.5-0.5-merge_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/llama-linear-1-0.5-0.5-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/llama-linear-1-0.5-0.5-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__llama-linear-1-0.5-0.5-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/llama-linear-1-0.5-0.5-merge
b71a096c0541759b219850134d81e8bbb3de8db4
25.278191
0
8.03
false
false
false
false
1.440188
0.451454
45.145436
0.552602
36.494371
0.247734
24.773414
0.292785
5.704698
0.41176
10.270052
0.363531
29.281176
false
false
2025-01-17
2025-01-21
0
khoantap/llama-linear-1-0.5-0.5-merge
khoantap_llama-slerp-merge_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/llama-slerp-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/llama-slerp-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__llama-slerp-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/llama-slerp-merge
011e6ceb076efa340a3b88f525de1523769d760d
24.169846
0
8.03
false
false
false
false
1.448099
0.497991
49.799089
0.578278
39.915314
0.083082
8.308157
0.302852
7.04698
0.405312
10.197396
0.367769
29.752142
false
false
2025-01-17
2025-01-21
0
khoantap/llama-slerp-merge
khoantap_moe-out-merge_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/khoantap/moe-out-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khoantap/moe-out-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khoantap__moe-out-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khoantap/moe-out-merge
6ded40192b68587aeb2eb05c8f1d4f09a4cfa3f5
21.177516
0
19.305
false
true
false
false
4.691128
0.45048
45.048028
0.515117
30.041206
0.0929
9.29003
0.288591
5.145414
0.406302
11.454427
0.334774
26.085993
false
false
2025-01-20
2025-01-21
0
khoantap/moe-out-merge
khulaifi95_Llama-3.1-8B-Reason-Blend-888k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/khulaifi95/Llama-3.1-8B-Reason-Blend-888k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">khulaifi95/Llama-3.1-8B-Reason-Blend-888k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/khulaifi95__Llama-3.1-8B-Reason-Blend-888k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
khulaifi95/Llama-3.1-8B-Reason-Blend-888k
bf30101a0cbfca2265528aa9ffc4397f7581df7e
21.101959
other
3
8.03
true
false
false
true
1.377757
0.58317
58.317043
0.478953
26.547568
0.115559
11.555891
0.279362
3.914989
0.337938
2.942187
0.310007
23.334072
false
false
2024-12-24
2024-12-27
1
khulaifi95/Llama-3.1-8B-Reason-Blend-888k (Merge)
kms7530_chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_llama-3-8b-Instruct-bnb-4bit_24_1_100_1
f296897830363557c84cc4a942c2cd1f91818ae4
17.996717
apache-2.0
0
9.3
true
false
false
true
4.599955
0.545501
54.550149
0.428904
19.07919
0.061934
6.193353
0.270134
2.684564
0.382062
5.491146
0.279837
19.9819
false
false
2024-10-10
2024-10-14
2
Removed
kms7530_chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_phi-3-mini-4k-instruct-bnb-4bit_16_4_100_1_nonmath
81453e5718775630581ab9950e6c0ccf0d7a4177
22.1103
apache-2.0
1
4.132
true
false
false
true
2.713969
0.486325
48.632517
0.498718
29.259631
0.108006
10.800604
0.310403
8.053691
0.398281
8.351823
0.348072
27.563534
false
false
2024-11-23
2024-11-25
1
unsloth/Phi-3-mini-4k-instruct-bnb-4bit
kms7530_chemeng_qwen-math-7b_24_1_100_1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_qwen-math-7b_24_1_100_1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_qwen-math-7b_24_1_100_1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_qwen-math-7b_24_1_100_1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_qwen-math-7b_24_1_100_1
b3c1a1875fe4679e8c402b2bde02ae6c1127eb63
11.664856
apache-2.0
0
8.911
true
false
false
true
6.794124
0.211052
21.105223
0.357801
10.326751
0.22432
22.432024
0.244128
0
0.368698
3.253906
0.215841
12.871232
false
false
2024-10-10
2024-10-14
4
Qwen/Qwen2.5-7B
kms7530_chemeng_qwen-math-7b_24_1_100_1_nonmath_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/kms7530/chemeng_qwen-math-7b_24_1_100_1_nonmath" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kms7530/chemeng_qwen-math-7b_24_1_100_1_nonmath</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kms7530__chemeng_qwen-math-7b_24_1_100_1_nonmath-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kms7530/chemeng_qwen-math-7b_24_1_100_1_nonmath
ef9926d75ab1d54532f6a30dd5e760355eb9aa4d
16.98209
apache-2.0
0
15.231
true
false
false
true
2.558062
0.258363
25.836336
0.389286
14.135345
0.309668
30.966767
0.290268
5.369128
0.408698
9.453906
0.24518
16.131058
false
false
2024-11-21
2024-11-22
4
Qwen/Qwen2.5-7B
kno10_ende-chat-0.0.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kno10/ende-chat-0.0.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kno10/ende-chat-0.0.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kno10__ende-chat-0.0.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kno10/ende-chat-0.0.5
fff913e8ce204bab72b02582b663db669cb61412
10.850085
apache-2.0
0
7.891
true
false
false
true
2.964035
0.340446
34.044557
0.360437
11.125831
0.020393
2.039275
0.265101
2.013423
0.393844
7.097135
0.179023
8.78029
false
false
2024-06-27
2024-06-27
0
kno10/ende-chat-0.0.5
kno10_ende-chat-0.0.7_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/kno10/ende-chat-0.0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kno10/ende-chat-0.0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kno10__ende-chat-0.0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kno10/ende-chat-0.0.7
1d45f51e5a3387378cea1036b0c65f2893466dd6
13.371914
apache-2.0
0
7.891
true
false
false
true
1.916732
0.440063
44.006348
0.379187
13.578949
0.017372
1.73716
0.28104
4.138702
0.386125
6.032292
0.196642
10.738032
false
false
2024-07-30
2024-07-30
0
kno10/ende-chat-0.0.7
kyutai_helium-1-preview-2b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
HeliumForCausalLM
<a target="_blank" href="https://huggingface.co/kyutai/helium-1-preview-2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kyutai/helium-1-preview-2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kyutai__helium-1-preview-2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kyutai/helium-1-preview-2b
dab850c85de673482dbf28b873064a274583e3b3
9.329144
cc-by-4.0
140
2.173
true
false
false
false
0.688162
0.261361
26.136097
0.363816
10.94514
0.013595
1.359517
0.278523
3.803132
0.354958
4.036458
0.187251
9.694518
false
true
2025-01-13
2025-01-14
0
kyutai/helium-1-preview-2b
kz919_QwQ-0.5B-Distilled-SFT_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/kz919/QwQ-0.5B-Distilled-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">kz919/QwQ-0.5B-Distilled-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/kz919__QwQ-0.5B-Distilled-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
kz919/QwQ-0.5B-Distilled-SFT
06b5127157cad87614a851f7b7b2ec2a9b8bd49d
9.089107
apache-2.0
23
0.494
true
false
false
true
1.019595
0.307673
30.767253
0.325629
7.277629
0.074018
7.401813
0.260906
1.454139
0.340854
1.106771
0.158743
6.527039
false
false
2025-01-05
2025-01-10
1
kz919/QwQ-0.5B-Distilled-SFT (Merge)
ladydaina_ECE-FDF_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ladydaina/ECE-FDF" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ladydaina/ECE-FDF</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ladydaina__ECE-FDF-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ladydaina/ECE-FDF
81e709d727e9ba5cf8707fe0c5c08e688a4cc6bd
20.042365
0
7.242
false
false
false
false
0.892448
0.372844
37.284405
0.515018
32.250998
0.081571
8.1571
0.282718
4.362416
0.450396
15.899479
0.300698
22.299793
false
false
2024-11-14
2024-11-14
1
ladydaina/ECE-FDF (Merge)
laislemke_LLaMA-2-vicuna-7b-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/laislemke/LLaMA-2-vicuna-7b-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">laislemke/LLaMA-2-vicuna-7b-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/laislemke__LLaMA-2-vicuna-7b-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
laislemke/LLaMA-2-vicuna-7b-slerp
84a64f0ac8ff7db632a9d012fd5f4dcdf1eff950
7.694402
llama2
0
6.738
true
false
false
true
1.194151
0.29321
29.320979
0.298622
2.598264
0.011329
1.132931
0.27349
3.131991
0.383302
6.179427
0.134225
3.802822
true
false
2024-07-03
2024-07-03
1
laislemke/LLaMA-2-vicuna-7b-slerp (Merge)
lalainy_ECE-PRYMMAL-0.5B-FT-V5-MUSR_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-0.5B-FT-V5-MUSR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR
bf80bf3d14a79b5dcb322b97b6dbaf10e316a3ee
7.057838
apache-2.0
0
0.494
true
false
false
false
1.16588
0.213775
21.377501
0.326944
6.485922
0.045317
4.531722
0.274329
3.243848
0.32625
0.78125
0.153341
5.926788
false
false
2024-10-22
2024-10-22
0
lalainy/ECE-PRYMMAL-0.5B-FT-V5-MUSR
lalainy_ECE-PRYMMAL-0.5B-SLERP-V4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-0.5B-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-0.5B-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-0.5B-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-0.5B-SLERP-V4
3a34c33dba0f02cd8c5172f45b6f6510cad1563d
4.380943
apache-2.0
0
0.494
true
false
false
false
1.890931
0.156397
15.639725
0.289431
2.09608
0
0
0.262584
1.677852
0.378927
4.999219
0.116855
1.872784
false
false
2024-10-22
2024-10-22
0
lalainy/ECE-PRYMMAL-0.5B-SLERP-V4
lalainy_ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1
7865b6f386969b831e9c1754914463154fecbda2
3.610722
apache-2.0
0
0.494
true
false
false
false
1.050547
0.143708
14.370758
0.303195
2.929449
0.000755
0.075529
0.234899
0
0.364604
2.942187
0.112118
1.34641
false
false
2024-11-09
2024-11-12
0
lalainy/ECE-PRYMMAL-YL-0.5B-SLERP-BIS-V1
lalainy_ECE-PRYMMAL-YL-1B-SLERP-V3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-1B-SLERP-V3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3
eef4293be744aef0524f00a7657e915a6601a459
16.447901
apache-2.0
0
1.544
true
false
false
false
1.186214
0.325009
32.500875
0.422455
18.228655
0.097432
9.743202
0.294463
5.928412
0.421281
10.826823
0.293135
21.459441
false
false
2024-11-12
2024-11-12
0
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V3
lalainy_ECE-PRYMMAL-YL-1B-SLERP-V4_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-1B-SLERP-V4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4
dfa5e42b6f4f83cacc3b9e7d0ff05fec7f941835
16.438385
apache-2.0
0
1.544
true
false
false
false
1.223743
0.332353
33.23526
0.417074
17.411752
0.100453
10.045317
0.286074
4.809843
0.430615
12.09349
0.289312
21.034648
false
false
2024-11-12
2024-11-12
0
lalainy/ECE-PRYMMAL-YL-1B-SLERP-V4
lalainy_ECE-PRYMMAL-YL-6B-SLERP-V1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-6B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1
56789ff5fcc863460fce652ebe6ed6bb5a4bd30c
20.037042
apache-2.0
0
6.061
true
false
false
false
1.003161
0.326407
32.640727
0.462937
24.515259
0.126888
12.688822
0.288591
5.145414
0.486396
20.632813
0.321393
24.599217
false
false
2024-11-08
2024-11-08
0
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V1
lalainy_ECE-PRYMMAL-YL-6B-SLERP-V2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lalainy__ECE-PRYMMAL-YL-6B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2
18d282d0206ae8f878a9cfa80ce4eaf042056569
20.011646
apache-2.0
0
6.061
true
false
false
false
0.996262
0.324884
32.488353
0.462937
24.515259
0.126888
12.688822
0.288591
5.145414
0.486396
20.632813
0.321393
24.599217
false
false
2024-11-09
2024-11-09
0
lalainy/ECE-PRYMMAL-YL-6B-SLERP-V2
langgptai_Qwen-las-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/langgptai/Qwen-las-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">langgptai/Qwen-las-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/langgptai__Qwen-las-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
langgptai/Qwen-las-v0.1
a7a4d4945d28bac955554c9abd2f74a71ebbf22f
11.633178
other
0
7.901
true
false
false
true
3.596193
0.330104
33.010412
0.389255
14.69864
0.037009
3.700906
0.246644
0
0.370094
3.661719
0.232547
14.727394
false
false
2024-05-26
2024-06-27
1
Qwen/Qwen1.5-4B-Chat
langgptai_qwen1.5-7b-chat-sa-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/langgptai/qwen1.5-7b-chat-sa-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">langgptai/qwen1.5-7b-chat-sa-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/langgptai__qwen1.5-7b-chat-sa-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
langgptai/qwen1.5-7b-chat-sa-v0.1
5f4f5e69ac7f1d508f8369e977de208b4803444b
16.580171
other
0
15.443
true
false
false
true
1.464321
0.426774
42.677429
0.432527
20.302342
0.030211
3.021148
0.312081
8.277405
0.355146
3.059896
0.299285
22.142804
false
false
2024-05-30
2024-06-27
1
Qwen/Qwen1.5-7B-Chat
lars1234_Mistral-Small-24B-Instruct-2501-writer_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/lars1234/Mistral-Small-24B-Instruct-2501-writer" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lars1234/Mistral-Small-24B-Instruct-2501-writer</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lars1234__Mistral-Small-24B-Instruct-2501-writer-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lars1234/Mistral-Small-24B-Instruct-2501-writer
45850ca22637c0f5eaa2aa1fd22cf6d8aa619d47
39.85579
apache-2.0
12
23.572
true
false
false
false
1.296353
0.656535
65.653466
0.673316
52.78404
0.35574
35.574018
0.389262
18.568233
0.464531
17.133073
0.544797
49.421912
false
false
2025-03-06
2025-03-07
1
lars1234/Mistral-Small-24B-Instruct-2501-writer (Merge)
leafspark_Llama-3.1-8B-MultiReflection-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/leafspark/Llama-3.1-8B-MultiReflection-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">leafspark/Llama-3.1-8B-MultiReflection-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/leafspark__Llama-3.1-8B-MultiReflection-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
leafspark/Llama-3.1-8B-MultiReflection-Instruct
b748441154efdbd7690d773b0194197bfc136ed0
26.878347
llama3.1
5
8.03
true
false
false
true
1.696894
0.712538
71.253829
0.500909
28.448045
0.170695
17.069486
0.292785
5.704698
0.368198
8.52474
0.372424
30.269282
false
false
2024-09-15
2024-09-15
1
leafspark/Llama-3.1-8B-MultiReflection-Instruct (Merge)
lemon07r_Gemma-2-Ataraxy-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-9B
fb22193268c7a6c3b4598255999ce2de3af8c256
23.711508
gemma
75
10.159
true
false
false
false
5.821376
0.300877
30.087723
0.59313
42.031991
0.085347
8.534743
0.334732
11.297539
0.442427
14.470052
0.422623
35.847001
true
false
2024-08-14
2024-08-27
1
lemon07r/Gemma-2-Ataraxy-9B (Merge)
lemon07r_Gemma-2-Ataraxy-Advanced-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-Advanced-9B
960654f5780f0b458367a6b591ad8440892c2aad
28.34484
4
10.159
false
false
false
false
6.454554
0.551596
55.159643
0.588907
41.161438
0.197885
19.78852
0.33557
11.409396
0.376073
6.509115
0.424368
36.040928
false
false
2024-09-30
2024-09-30
1
lemon07r/Gemma-2-Ataraxy-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-Remix-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-Remix-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-Remix-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-Remix-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-Remix-9B
f917a9be9f86d58fe122d58ba84cf4b08e4a975e
32.358348
5
10.159
false
false
false
false
4.314882
0.708342
70.834164
0.589202
41.592313
0.201662
20.166163
0.338926
11.856823
0.437188
13.715104
0.42387
35.98552
false
false
2024-09-30
2024-09-30
1
lemon07r/Gemma-2-Ataraxy-Remix-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v2-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v2-9B
77aca48ac25eb2cbe8c0751a4ef77e5face34d80
20.432642
17
10.159
false
false
false
false
5.992484
0.213624
21.362429
0.576584
39.796854
0.084592
8.459215
0.342282
12.304251
0.348385
4.88151
0.422124
35.791593
false
false
2024-09-28
2024-09-28
1
lemon07r/Gemma-2-Ataraxy-v2-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v2a-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v2a-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v2a-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v2a-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v2a-9B
899fb093d80569fc919f53217e3acf031dde89a5
16.038999
2
10.159
false
false
false
false
5.962792
0.159469
15.94691
0.518249
31.198528
0.061178
6.117825
0.339765
11.96868
0.316479
3.059896
0.351479
27.942154
false
false
2024-09-29
2024-09-29
1
lemon07r/Gemma-2-Ataraxy-v2a-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v2f-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v2f-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v2f-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v2f-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v2f-9B
44da9d6a9bc7be5a9af24fb0951047849d5f717d
20.704514
2
10.159
false
false
false
false
6.797191
0.379114
37.911408
0.519285
31.421336
0.116314
11.63142
0.338926
11.856823
0.323146
3.593229
0.350316
27.812869
false
false
2024-09-30
2024-09-30
1
lemon07r/Gemma-2-Ataraxy-v2f-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3-Advanced-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B
318afe2b44a150780e44483a0f90a499e81f946f
31.430554
3
10.159
false
false
false
false
5.575992
0.660182
66.018165
0.593515
42.210472
0.187311
18.731118
0.336409
11.521253
0.444969
14.58776
0.419631
35.514554
false
false
2024-10-09
2024-10-09
1
lemon07r/Gemma-2-Ataraxy-v3-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3b-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3b-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3b-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3b-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3b-9B
de8bbacddabf22dad89658d3b3d358b3eccbd59c
32.188535
2
9.242
false
false
false
false
4.607456
0.680914
68.091442
0.59077
41.623985
0.215257
21.52568
0.333054
11.073826
0.448875
15.209375
0.420462
35.6069
false
false
2024-10-08
2024-10-08
1
lemon07r/Gemma-2-Ataraxy-v3b-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3i-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3i-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3i-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3i-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3i-9B
8bd1ce81b6f42ebeebd9957b605c7313eedbe0a8
23.824039
3
9.242
false
false
false
false
6.95102
0.420305
42.030479
0.562575
38.238825
0.153323
15.332326
0.32802
10.402685
0.318062
1.757812
0.416639
35.182107
false
false
2024-10-06
2024-10-06
1
lemon07r/Gemma-2-Ataraxy-v3i-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v3j-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v3j-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v3j-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v3j-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v3j-9B
7ad4a1bf604f37bd82f3470dbc24870896d7287d
23.987246
2
9.242
false
false
false
false
6.827645
0.416933
41.693263
0.563229
38.166569
0.169184
16.918429
0.32802
10.402685
0.318031
1.920573
0.413398
34.821956
false
false
2024-10-09
2024-10-09
1
lemon07r/Gemma-2-Ataraxy-v3j-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4-Advanced-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B
bc9edb78753fc60a22268cd91e93e43dd9fbc648
33.402774
6
10.159
false
false
false
false
4.659128
0.701547
70.154745
0.602363
43.181897
0.215257
21.52568
0.338926
11.856823
0.458052
16.289844
0.436669
37.407654
false
false
2024-10-13
2024-10-14
1
lemon07r/Gemma-2-Ataraxy-v4-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4a-Advanced-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4a-Advanced-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B
78dca140ec1b704233c932706fc9640404433cc5
33.285861
4
10.159
false
false
false
false
4.579236
0.713512
71.351237
0.598839
42.737517
0.21148
21.148036
0.34396
12.527964
0.448906
15.179948
0.430934
36.770464
false
false
2024-10-14
2024-10-14
1
lemon07r/Gemma-2-Ataraxy-v4a-Advanced-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4b-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4b-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4b-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4b-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4b-9B
70dc6ddfaede76ff01584922fca53ba90837cd52
33.468349
2
10.159
false
false
false
false
4.850034
0.687834
68.783384
0.603916
43.442739
0.233384
23.338369
0.340604
12.080537
0.455479
15.868229
0.435672
37.296838
false
false
2024-10-16
2024-10-22
1
lemon07r/Gemma-2-Ataraxy-v4b-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4c-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4c-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4c-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4c-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4c-9B
26f2619a432266a5f73c135804b1aa34f00ec689
33.406963
4
10.159
false
false
false
false
5.530475
0.694528
69.45283
0.608432
44.125367
0.226586
22.65861
0.333893
11.185682
0.452781
15.297656
0.439495
37.721631
false
false
2024-10-16
2024-10-16
1
lemon07r/Gemma-2-Ataraxy-v4c-9B (Merge)
lemon07r_Gemma-2-Ataraxy-v4d-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Gemma-2-Ataraxy-v4d-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Gemma-2-Ataraxy-v4d-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Gemma-2-Ataraxy-v4d-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Gemma-2-Ataraxy-v4d-9B
24f9ad78e42c92df5277b3aea4deb4083a8625d9
34.242386
gemma
15
10.159
true
false
false
false
5.370555
0.725003
72.500299
0.605416
43.595239
0.233384
23.338369
0.347315
12.975391
0.454146
15.868229
0.434591
37.176788
true
false
2024-10-25
2024-10-25
1
lemon07r/Gemma-2-Ataraxy-v4d-9B (Merge)
lemon07r_Llama-3-RedMagic4-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/Llama-3-RedMagic4-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/Llama-3-RedMagic4-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__Llama-3-RedMagic4-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/Llama-3-RedMagic4-8B
65ee08a0434f1903a8971640fc3cca6c8ae8590e
19.43099
llama3
0
8.03
true
false
false
true
1.597192
0.486401
48.640053
0.425605
19.475747
0.089879
8.987915
0.290268
5.369128
0.376635
4.379427
0.367603
29.733673
true
false
2024-06-19
2024-06-26
1
lemon07r/Llama-3-RedMagic4-8B (Merge)
lemon07r_llama-3-NeuralMahou-8b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lemon07r/llama-3-NeuralMahou-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lemon07r/llama-3-NeuralMahou-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lemon07r__llama-3-NeuralMahou-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lemon07r/llama-3-NeuralMahou-8b
59a0937df85f9d6d65d15dbb4a7c06b6ad8a0305
19.846074
llama3
1
8.03
true
false
false
true
1.692455
0.490097
49.009739
0.418411
18.692069
0.101964
10.196375
0.288591
5.145414
0.387271
6.142188
0.369016
29.890662
true
false
2024-05-30
2024-06-26
1
lemon07r/llama-3-NeuralMahou-8b (Merge)
lesubra_ECE-EIFFEL-3B_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-EIFFEL-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-EIFFEL-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-EIFFEL-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-EIFFEL-3B
aa56433ac824d245ac82d5e55ce8e589df0711ec
22.504423
apache-2.0
0
3.821
true
false
false
false
2.308814
0.346941
34.694056
0.510158
31.286439
0.121601
12.160121
0.331376
10.850112
0.436229
14.695313
0.382064
31.340499
true
false
2024-10-01
2024-10-01
0
lesubra/ECE-EIFFEL-3B
lesubra_ECE-EIFFEL-3Bv2_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-EIFFEL-3Bv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-EIFFEL-3Bv2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-EIFFEL-3Bv2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-EIFFEL-3Bv2
b059d1a0d49f09d6df34d93f133d24f6641bc535
23.141091
apache-2.0
0
3.821
true
false
false
false
1.72069
0.301303
30.130277
0.542401
36.353133
0.11858
11.858006
0.33557
11.409396
0.444292
15.769792
0.399934
33.325946
true
false
2024-10-03
2024-10-03
0
lesubra/ECE-EIFFEL-3Bv2
lesubra_ECE-EIFFEL-3Bv3_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-EIFFEL-3Bv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-EIFFEL-3Bv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-EIFFEL-3Bv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-EIFFEL-3Bv3
2cd31e58d38b96626a8a83192b5d2eec6669f5e2
25.501227
apache-2.0
0
3.821
true
false
false
false
1.434294
0.378614
37.86143
0.546945
36.464083
0.166918
16.691843
0.329698
10.626398
0.46751
18.305469
0.397523
33.058141
true
false
2024-10-07
2024-10-07
0
lesubra/ECE-EIFFEL-3Bv3
lesubra_ECE-PRYMMAL-3B-SLERP-V1_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-PRYMMAL-3B-SLERP-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-PRYMMAL-3B-SLERP-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-PRYMMAL-3B-SLERP-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-PRYMMAL-3B-SLERP-V1
e46f1de93f10b1a57f9175653fd29dda355a61e6
23.13536
apache-2.0
0
3.821
true
false
false
false
1.427188
0.293284
29.328404
0.534059
35.053068
0.166163
16.616314
0.317114
8.948546
0.45951
16.638802
0.390043
32.227024
true
false
2024-10-28
2024-10-28
0
lesubra/ECE-PRYMMAL-3B-SLERP-V1
lesubra_ECE-PRYMMAL-3B-SLERP-V2_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-PRYMMAL-3B-SLERP-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-PRYMMAL-3B-SLERP-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-PRYMMAL-3B-SLERP-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-PRYMMAL-3B-SLERP-V2
ba617ea0b1ed5497f62bf49635c30bcfb0547133
23.13536
apache-2.0
0
3.821
true
false
false
false
1.460651
0.293284
29.328404
0.534059
35.053068
0.166163
16.616314
0.317114
8.948546
0.45951
16.638802
0.390043
32.227024
true
false
2024-10-28
2024-10-28
0
lesubra/ECE-PRYMMAL-3B-SLERP-V2
lesubra_ECE-PRYMMAL-3B-SLERP_2-V1_float16
float16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/lesubra/ECE-PRYMMAL-3B-SLERP_2-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lesubra/ECE-PRYMMAL-3B-SLERP_2-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/lesubra__ECE-PRYMMAL-3B-SLERP_2-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lesubra/ECE-PRYMMAL-3B-SLERP_2-V1
354e5c732dd2fde016da1e33a018d2d2787f7805
24.961424
apache-2.0
0
3.821
true
false
false
false
1.278855
0.364901
36.490069
0.541145
35.710681
0.167674
16.767372
0.321309
9.50783
0.466146
18.068229
0.399019
33.224365
true
false
2024-11-06
2024-11-06
0
lesubra/ECE-PRYMMAL-3B-SLERP_2-V1