Dataset columns (name: dtype, observed range or number of distinct values):

eval_name: string (length 9–97)
Precision: string (5 classes)
Type: string (6 classes)
T: string (6 classes)
Weight type: string (3 classes)
Architecture: string (53 classes)
Model: string (length 355–611)
fullname: string (length 4–89)
Model sha: string (length 0–40)
Average ⬆️: float64 (27–81.3)
Hub License: string (35 classes)
Hub ❤️: int64 (0–4.88k)
#Params (B): int64 (0–238)
Available on the hub: bool (2 classes)
Merged: bool (2 classes)
MoE: bool (2 classes)
Flagged: bool (1 class)
date: string (length 0–26)
Chat Template: bool (2 classes)
ARC: float64 (19.7–87.5)
HellaSwag: float64 (20.7–92.8)
MMLU: float64 (17.8–89.4)
TruthfulQA: float64 (27.9–82.3)
Winogrande: float64 (47.2–91.5)
GSM8K: float64 (0–88.2)
Maintainers Choice: bool (2 classes)
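The records that follow are a flat dump of these columns, one field value per line, in the order listed above; fields that are empty for a given model (for example, a blank Hub License) appear to be omitted from its record. As a minimal sketch of how this table could be queried programmatically, the snippet below loads it with the 🤗 datasets library and ranks 7B-parameter models by their leaderboard average. The dataset id "open-llm-leaderboard/contents" and the "train" split are assumptions inferred from the details_* dataset links in the Model column, not something stated in this dump.

```python
# Minimal sketch, not a confirmed recipe: load the leaderboard table and rank models.
from datasets import load_dataset

DATASET_ID = "open-llm-leaderboard/contents"  # assumed repository id, not stated in this dump

# Load the table into a pandas DataFrame (the "train" split name is also an assumption).
df = load_dataset(DATASET_ID, split="train").to_pandas()

# Keep 7B-parameter models that are available on the hub, then sort by the
# leaderboard average, using the column names from the schema above.
seven_b = df[(df["#Params (B)"] == 7) & df["Available on the hub"]]
ranked = seven_b.sort_values("Average ⬆️", ascending=False)

print(ranked[["fullname", "Average ⬆️", "ARC", "HellaSwag", "MMLU", "GSM8K"]].head(10))
```

If the real dataset id or column names differ, only the string literals above need to change; the filtering and sorting pattern stays the same.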
liminerity_m.star.7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/liminerity/m.star.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liminerity/m.star.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__m.star.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liminerity/m.star.7b
1de2c02db0939bd92748b207d8f56dc06105712a
64.318525
apache-2.0
1
7
true
true
true
true
2024-03-24T06:32:30Z
false
60.153584
80.959968
58.27637
53.93316
78.531965
54.056103
false
liminerity_mm4-3b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/liminerity/mm4-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liminerity/mm4-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__mm4-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liminerity/mm4-3b
0c43811e69b29c71d87b51b9ae94812616111293
53.224429
apache-2.0
3
3
true
true
true
true
2024-02-27T04:26:44Z
false
44.795222
70.41426
50.897461
43.199142
66.219416
43.821077
false
liminerity_phigment6-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/liminerity/phigment6-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liminerity/phigment6-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__phigment6-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liminerity/phigment6-slerp
976d97de8cb3a7af72aa6ef9583d186f6911f919
63.581293
apache-2.0
3
2
false
true
true
true
2024-02-25T05:23:36Z
false
62.627986
77.245569
58.645684
50.488221
73.875296
58.605004
false
liminerity_ultra0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/liminerity/ultra0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liminerity/ultra0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__ultra0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liminerity/ultra0
046f98426c1b0da043e82a110f9690268b826b5f
44.320855
apache-2.0
0
1
true
false
true
true
2024-02-16T20:13:27Z
false
41.467577
68.024298
33.366322
41.485076
65.509077
16.072782
false
lingyun1_GZDX_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/lingyun1/GZDX" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lingyun1/GZDX</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lingyun1__GZDX" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lingyun1/GZDX
46dae3c214271bc046ae303349e4bf71e95a8836
37.965444
0
1
false
true
true
true
2024-05-07T04:39:56Z
false
35.750853
55.56662
25.192223
42.032629
58.484609
10.765732
false
lingyun1_GZDX-1.1B_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/lingyun1/GZDX-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lingyun1/GZDX-1.1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lingyun1__GZDX-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lingyun1/GZDX-1.1B
3ef37f38219e3d58041e98d08e4c222c79f067f6
39.351137
0
1
false
true
true
true
2024-05-27T09:09:56Z
false
37.030717
54.670384
35.50291
40.467765
58.958169
9.476876
false
linlinlin_zephy_SFT_Hermes_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/linlinlin/zephy_SFT_Hermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">linlinlin/zephy_SFT_Hermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_linlinlin__zephy_SFT_Hermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
linlinlin/zephy_SFT_Hermes
d64495ffe34dbd40d5fe93639ca6f967d7c684cf
60.801388
mit
0
0
true
true
true
true
2024-03-11T09:46:51Z
false
60.324232
83.369847
63.810583
42.171719
78.058406
37.073541
false
liuchanghf_bloomz-3b-mmlu-lora_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/liuchanghf/bloomz-3b-mmlu-lora" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liuchanghf/bloomz-3b-mmlu-lora</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liuchanghf__bloomz-3b-mmlu-lora" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liuchanghf/bloomz-3b-mmlu-lora
955b903e9e1c56bb729df4eaa4a909ae16d62795
37.066079
bigscience-bloom-rail-1.0
0
3
false
true
true
true
2024-04-11T08:51:27Z
false
35.836177
54.949213
34.240087
39.596728
57.77427
0
false
liuchanghf_bloomz3b-winogrande-pretrain_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/liuchanghf/bloomz3b-winogrande-pretrain" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liuchanghf/bloomz3b-winogrande-pretrain</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liuchanghf__bloomz3b-winogrande-pretrain" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liuchanghf/bloomz3b-winogrande-pretrain
637fe8cf3f7b036f09ceee5d81ddc2e479d34959
36.545026
bigscience-bloom-rail-1.0
0
3
false
true
true
true
2024-04-24T02:00:30Z
false
34.044369
52.499502
30.617096
39.441471
62.667719
0
false
liuchanghf_phi2-mmlu-lora_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/liuchanghf/phi2-mmlu-lora" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liuchanghf/phi2-mmlu-lora</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liuchanghf__phi2-mmlu-lora" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liuchanghf/phi2-mmlu-lora
1003b91b4ef71ab1dd4a0daac1ee1fce005b4c2f
52.546558
mit
0
0
false
true
true
true
2024-04-10T09:28:32Z
false
62.116041
74.048994
58.564926
44.188887
75.374901
0.985595
false
liuchanghf_phi2_gsm8k_lora_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/liuchanghf/phi2_gsm8k_lora" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liuchanghf/phi2_gsm8k_lora</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liuchanghf__phi2_gsm8k_lora" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liuchanghf/phi2_gsm8k_lora
18ac65fcc53e2e4ec14e5e912f5a1070523df635
59.886419
mit
0
0
false
true
true
true
2024-04-22T07:05:50Z
false
59.044369
74.28799
56.699836
38.860385
75.611681
54.814253
false
liuda1_Mistral-7B-golden_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/liuda1/Mistral-7B-golden" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liuda1/Mistral-7B-golden</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liuda1__Mistral-7B-golden" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liuda1/Mistral-7B-golden
bc4624485fef5a2e3fcde465eaf2191cb1df1877
52.488246
unknown
0
7
true
true
true
true
2023-12-12T06:54:51Z
false
60.750853
44.423422
59.289017
53.510035
76.637727
20.318423
false
liuda1_dm7b_sft_gpt88w_merge_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/liuda1/dm7b_sft_gpt88w_merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liuda1/dm7b_sft_gpt88w_merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liuda1__dm7b_sft_gpt88w_merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liuda1/dm7b_sft_gpt88w_merge
f4f76170f6fe63e832e32d32be1eb4a1da36f402
63.183472
apache-2.0
0
7
true
true
true
true
2023-12-26T03:32:47Z
false
62.286689
82.473611
61.349487
53.328866
77.584846
42.077331
false
liuxiang886_llama2-70B-qlora-gpt4_float16
float16
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/liuxiang886/llama2-70B-qlora-gpt4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">liuxiang886/llama2-70B-qlora-gpt4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_liuxiang886__llama2-70B-qlora-gpt4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
liuxiang886/llama2-70B-qlora-gpt4
08115ee077953e9c01c6a40f5086def3ecf9f5f0
65.294412
0
70
false
true
true
true
2023-10-16T12:46:18Z
false
70.307167
86.387174
69.291519
54.022167
82.872928
28.885519
false
lizhuang144_llama_mirror_13b_v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
Unknown
<a target="_blank" href="https://huggingface.co/lizhuang144/llama_mirror_13b_v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lizhuang144/llama_mirror_13b_v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lizhuang144__llama_mirror_13b_v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lizhuang144/llama_mirror_13b_v1.0
379cb8f080110f3418155029f534f67a79e25db4
52.45525
null
0
13
false
true
true
true
2023-09-09T10:52:17Z
false
57.593857
80.531767
47.997914
44.540363
76.637727
7.429871
false
lizhuang144_starcoder_mirror_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/lizhuang144/starcoder_mirror" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lizhuang144/starcoder_mirror</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lizhuang144__starcoder_mirror" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lizhuang144/starcoder_mirror
eb5f39bac15ccab9463001aa203e33d49f4ff7cb
35.427654
0
0
false
true
true
true
2023-09-09T10:52:17Z
false
31.313993
45.817566
29.293429
43.384654
57.221784
5.534496
false
lizpreciatior_lzlv_70b_fp16_hf_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lizpreciatior/lzlv_70b_fp16_hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lizpreciatior/lzlv_70b_fp16_hf
b366c0bb318ae592023cca894cc6b4421a607a0d
67.125959
cc-by-nc-2.0
70
68
true
true
true
true
2023-11-12T05:09:13Z
false
70.136519
87.542322
70.229281
60.489695
83.425414
30.932525
false
llama-anon_instruct-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/llama-anon/instruct-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llama-anon/instruct-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llama-anon__instruct-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llama-anon/instruct-13b
142e198df473fd0cd4370b0d50be5f57e1da399b
49.519368
0
12
false
true
true
true
2023-10-16T12:48:18Z
false
56.143345
80.272854
47.89213
36.973837
73.55959
2.27445
false
llm-agents_tora-13b-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-13b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-13b-v1.0
0636c1f582c979a5a292cc5f3dc293800b1494e2
53.621213
llama2
6
13
true
true
true
true
2023-10-16T13:19:55Z
false
58.959044
82.31428
54.73185
40.254468
75.611681
9.855951
false
llm-agents_tora-13b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-13b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-13b-v1.0
0636c1f582c979a5a292cc5f3dc293800b1494e2
55.372781
llama2
6
13
true
true
true
true
2024-01-04T12:50:47Z
false
58.959044
82.31428
54.594744
40.220405
75.374901
20.773313
false
llm-agents_tora-70b-v1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-70b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-70b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-70b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-70b-v1.0
e95fd7daf017e7c414ec07ebef4ddf013c16f9a4
63.386465
llama2
20
70
true
true
true
true
2023-10-16T13:00:29Z
false
67.74744
85.829516
69.22441
51.785702
81.925809
23.805914
false
llm-agents_tora-70b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-70b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-70b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-70b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-70b-v1.0
e95fd7daf017e7c414ec07ebef4ddf013c16f9a4
65.284339
llama2
20
70
true
true
true
true
2024-01-04T12:49:30Z
false
67.576792
85.819558
69.129235
51.763882
82.162589
35.25398
false
llm-agents_tora-7b-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-7b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-7b-v1.0
717edbee98945192b1a396fc9c337c5b32d6c79c
48.50287
llama2
6
7
true
true
true
true
2023-10-16T12:54:17Z
false
52.474403
78.679546
45.90254
37.899245
73.55959
2.501895
false
llm-agents_tora-code-13b-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-code-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-code-13b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-code-13b-v1.0
4bf5b528d95a507b435c24a8986afe80d5951782
42.695684
llama2
14
13
true
true
true
true
2023-10-16T12:46:18Z
false
44.453925
69.288986
36.670071
34.984306
62.588792
8.188021
false
llm-agents_tora-code-13b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-code-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-code-13b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-13b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-code-13b-v1.0
4bf5b528d95a507b435c24a8986afe80d5951782
44.185962
llama2
14
13
true
true
true
true
2024-01-04T12:51:54Z
false
44.709898
69.149572
36.687556
34.975612
63.141279
16.451857
false
llm-agents_tora-code-34b-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-code-34b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-code-34b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-code-34b-v1.0
cbb33eea774cc03d4363c424d81e8c9d58332274
48.953479
llama2
14
34
true
true
true
true
2023-10-16T13:27:38Z
false
50.426621
75.542721
46.778957
39.663998
68.192581
13.115997
false
llm-agents_tora-code-34b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-code-34b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-code-34b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-code-34b-v1.0
cbb33eea774cc03d4363c424d81e8c9d58332274
49.919699
llama2
14
34
true
true
true
true
2024-01-04T12:50:14Z
false
50.255973
75.482972
46.654697
39.617814
67.719021
19.787718
false
llm-agents_tora-code-7b-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llm-agents/tora-code-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-agents/tora-code-7b-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-7b-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-agents/tora-code-7b-v1.0
777501b69bb0ba2675abdcaf7b1309ab05320c2e
40.205083
llama2
16
7
true
true
true
true
2023-10-16T13:00:29Z
false
40.699659
65.863374
33.336573
34.840169
61.562747
4.927976
false
llm-jp_llm-jp-13b-instruct-full-jaster-dolly-oasst-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/llm-jp/llm-jp-13b-instruct-full-jaster-dolly-oasst-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-jp/llm-jp-13b-instruct-full-jaster-dolly-oasst-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-jp__llm-jp-13b-instruct-full-jaster-dolly-oasst-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-jp/llm-jp-13b-instruct-full-jaster-dolly-oasst-v1.0
68282fe744c69ea2e4420a4a6833c0b9168215eb
31.773109
apache-2.0
8
13
true
true
true
true
2023-11-27T01:58:48Z
false
26.877133
44.781916
23.116858
45.191873
50.670876
0
false
llm-jp_llm-jp-13b-instruct-full-jaster-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/llm-jp/llm-jp-13b-instruct-full-jaster-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llm-jp/llm-jp-13b-instruct-full-jaster-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llm-jp__llm-jp-13b-instruct-full-jaster-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llm-jp/llm-jp-13b-instruct-full-jaster-v1.0
b44eac954eac7ddbceba4f510325fd710c977eab
31.628498
apache-2.0
14
13
true
true
true
true
2023-11-27T01:58:24Z
false
27.21843
44.702251
23.116858
44.693984
50.039463
0
false
llmixer_BigWeave-v12-90b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llmixer/BigWeave-v12-90b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llmixer/BigWeave-v12-90b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v12-90b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llmixer/BigWeave-v12-90b
4518c1d85135efdb14ed8d3581d325ea2167d6b4
69.192968
llama2
2
87
true
false
true
true
2024-02-06T12:28:25Z
false
68.088737
87.701653
69.414365
61.353202
81.21547
47.384382
false
llmixer_BigWeave-v15-103b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llmixer/BigWeave-v15-103b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llmixer/BigWeave-v15-103b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v15-103b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llmixer/BigWeave-v15-103b
59004f5610548e626ad27cd4a7b92daa3ccfc9c8
71.666124
unknown
2
103
true
true
true
true
2024-02-06T14:08:23Z
false
69.709898
86.40709
71.249494
66.104063
80.347277
56.178923
false
llmixer_BigWeave-v16-103b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llmixer/BigWeave-v16-103b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llmixer/BigWeave-v16-103b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v16-103b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llmixer/BigWeave-v16-103b
a1f70cd042fc8b4c5767f597edbb0054e7cb14f9
72.020696
unknown
4
103
true
false
true
true
2024-02-06T22:31:36Z
false
65.870307
87.612029
73.223427
63.809493
80.426204
61.182714
false
llmixer_BigWeave-v20-110b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llmixer/BigWeave-v20-110b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llmixer/BigWeave-v20-110b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v20-110b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llmixer/BigWeave-v20-110b
1e363188df8256180530fc42688bdb6b3de66b0a
68.028234
unknown
0
110
true
true
true
true
2024-02-15T17:52:11Z
false
68.174061
88.53814
70.507811
62.474525
82.083662
36.391205
false
llmixer_BigWeave-v6-90b_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/llmixer/BigWeave-v6-90b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">llmixer/BigWeave-v6-90b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_llmixer__BigWeave-v6-90b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
llmixer/BigWeave-v6-90b
cf0355244f8cb18a0e3128e292219ccf774fe418
67.473407
llama2
0
87
true
true
true
true
2024-02-06T12:28:38Z
false
65.358362
87.213702
68.041703
57.95842
81.689029
44.579227
false
lmsys_longchat-13b-16k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/longchat-13b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/longchat-13b-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__longchat-13b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/longchat-13b-16k
70e2e38b82f1e25d8b90b50fbfc2361123bef45f
49.637542
null
131
13
true
true
true
true
2023-09-09T10:52:17Z
false
53.583618
77.67377
45.237704
47.073518
70.086819
4.169826
true
lmsys_longchat-7b-v1.5-32k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/longchat-7b-v1.5-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/longchat-7b-v1.5-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__longchat-7b-v1.5-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/longchat-7b-v1.5-32k
16deb633ef4d6a18d5750239edc5a85ffeaf3918
47.949466
57
7
true
true
true
true
2023-09-09T10:52:17Z
false
51.706485
74.965146
43.163028
44.419652
68.66614
4.776346
true
lmsys_vicuna-13b-delta-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-13b-delta-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-13b-delta-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-delta-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-13b-delta-v1.1
ffed4c7cf1b9814812078efbe29ec3f610ea39e7
53.281971
null
411
13
true
true
true
true
2023-09-09T10:52:17Z
false
52.730375
80.143398
51.900471
52.083668
74.191002
8.642911
true
lmsys_vicuna-13b-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-13b-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-13b-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-13b-v1.1
8c71dbe9221e83d2ec72e4dc08beccfc78b563c0
53.281971
null
97
13
true
true
true
true
2023-09-09T10:52:17Z
false
52.730375
80.143398
51.900471
52.083668
74.191002
8.642911
true
lmsys_vicuna-13b-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-13b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-13b-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-13b-v1.3
7900eeb715a49affee9e6390f824e62eea3f3fb1
54.270636
null
191
13
true
true
true
true
2023-09-09T10:52:17Z
false
54.607509
80.412268
52.877673
52.138219
74.822415
10.765732
true
lmsys_vicuna-13b-v1.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-13b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-13b-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-13b-v1.5
3deb0106f72a3a433f0c6ea0cb978bdf14bcd3a6
55.411542
llama2
195
13
true
true
true
true
2023-09-09T10:52:17Z
false
57.081911
81.238797
56.67295
51.514596
74.664562
11.296437
true
lmsys_vicuna-13b-v1.5-16k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-13b-v1.5-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-13b-v1.5-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.5-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-13b-v1.5-16k
277697af19d4b267626ebc9f4e078d19a9a0fddf
54.973456
llama2
218
13
true
true
true
true
2023-10-16T12:48:18Z
false
56.740614
80.372436
55.275036
51.960965
72.375691
13.115997
true
lmsys_vicuna-33b-v1.3_float16
float16
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-33b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-33b-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-33b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-33b-v1.3
ef8d6becf883fb3ce52e3706885f761819477ab4
58.541814
null
285
33
true
true
true
true
2023-10-16T12:48:18Z
false
62.116041
83.001394
59.216472
56.162103
77.03236
13.722517
true
lmsys_vicuna-7b-delta-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-delta-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-delta-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-delta-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-delta-v1.1
24fb8e1e9cc78e0aa7ef154b026c4a83296e3fc4
50.369758
null
203
7
true
true
true
true
2023-09-09T10:52:17Z
false
53.668942
77.504481
45.606054
48.949563
70.955012
5.534496
true
lmsys_vicuna-7b-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-v1.3
ac066c83424c4a7221aa10c0ebe074b24d3bcdb6
49.775766
null
123
7
true
true
true
true
2023-09-09T10:52:17Z
false
50.426621
76.916949
48.137167
47.006281
70.481452
5.686126
true
lmsys_vicuna-7b-v1.5_float16
float16
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-v1.5
de56c35b1763eaae20f4d60efd64af0a9091ebe5
52.056621
llama2
255
7
true
true
true
true
2024-06-09T14:59:11Z
false
53.242321
77.394941
51.037449
50.338082
72.138911
8.188021
true
lmsys_vicuna-7b-v1.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-v1.5
de56c35b1763eaae20f4d60efd64af0a9091ebe5
51.993716
llama2
255
7
true
true
true
true
2023-09-09T10:52:17Z
false
53.242321
77.394941
50.823143
50.329702
72.059984
8.112206
true
lmsys_vicuna-7b-v1.5-16k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.5-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.5-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-v1.5-16k
9a93d7d11fac7f3f9074510b80092b53bc1a5bec
51.422542
llama2
83
7
true
true
true
true
2023-09-09T10:52:17Z
false
54.180887
77.305318
49.297218
50.349427
71.033938
6.368461
true
lmsys_vicuna-7b-v1.5-16k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lmsys/vicuna-7b-v1.5-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lmsys/vicuna-7b-v1.5-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-7b-v1.5-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lmsys/vicuna-7b-v1.5-16k
9a93d7d11fac7f3f9074510b80092b53bc1a5bec
51.581579
llama2
83
7
true
true
true
true
2023-09-09T10:52:17Z
false
54.692833
77.315276
49.511445
50.412782
71.112865
6.444276
true
localfultonextractor_Erosumika-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/localfultonextractor/Erosumika-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">localfultonextractor/Erosumika-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_localfultonextractor__Erosumika-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
localfultonextractor/Erosumika-7B
fe495970cdaf66b16b2dc77567adb7bf3fe7fe90
64.922705
0
7
false
true
true
true
2024-03-03T07:52:17Z
false
62.883959
85.899223
60.644846
67.587055
75.295975
37.225171
false
localfultonextractor_Erosumika-7B-v2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/localfultonextractor/Erosumika-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">localfultonextractor/Erosumika-7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
localfultonextractor/Erosumika-7B-v2
d391a01d8277f80b159ca4c06a4316b771241be6
67.644506
0
7
false
true
true
true
2024-03-24T18:50:55Z
false
65.614334
86.287592
62.50926
69.000964
77.26914
45.185747
false
localfultonextractor_Erosumika-7B-v3_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/localfultonextractor/Erosumika-7B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">localfultonextractor/Erosumika-7B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
localfultonextractor/Erosumika-7B-v3
d80884197f744524ba44fb587944e7bde053e249
69.801704
0
7
false
true
true
true
2024-03-26T18:59:19Z
false
67.491468
85.690102
64.154277
62.12238
82.794002
56.557998
false
localfultonextractor_Erosumika-7B-v3-0.2_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/localfultonextractor/Erosumika-7B-v3-0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">localfultonextractor/Erosumika-7B-v3-0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_localfultonextractor__Erosumika-7B-v3-0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
localfultonextractor/Erosumika-7B-v3-0.2
a634a34eb846fb891c58e45b82997c56abdac4c1
65.649462
0
7
false
true
true
true
2024-03-26T21:21:47Z
false
67.74744
84.953197
60.001525
55.766543
81.531176
43.896892
false
lodrick-the-lafted_Fuselage-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Fuselage-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Fuselage-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Fuselage-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Fuselage-8B
e3e160727514c9f1eb3f488f2768ddd59d459c1c
65.540482
apache-2.0
1
8
true
true
true
true
2024-05-11T13:14:11Z
false
60.068259
77.713603
64.959026
50.330159
74.743489
65.428355
false
lodrick-the-lafted_Grafted-Hermetic-Platypus-A-2x7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-A-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Grafted-Hermetic-Platypus-A-2x7B
6e102b60cde5dc38374bf4906a8cdeb0411321f0
64.232294
apache-2.0
0
12
true
true
false
true
2024-03-02T10:47:15Z
false
59.300341
82.891854
62.003533
61.077856
77.663773
42.456406
false
lodrick-the-lafted_Grafted-Hermetic-Platypus-B-2x7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-B-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Grafted-Hermetic-Platypus-B-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-B-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Grafted-Hermetic-Platypus-B-2x7B
1972aa1c2ad8f1b808efa9bce98ec154cd361264
64.651867
apache-2.0
0
12
true
true
false
true
2024-03-02T23:24:52Z
false
59.47099
82.951603
62.146416
61.487601
77.426993
44.427597
false
lodrick-the-lafted_Grafted-Hermetic-Platypus-C-2x7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-C-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Grafted-Hermetic-Platypus-C-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-C-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Grafted-Hermetic-Platypus-C-2x7B
778903b24f320ce4e46d9e43ff296a64a6b835b6
64.386327
apache-2.0
0
12
true
true
false
true
2024-03-03T08:25:52Z
false
58.959044
82.772356
62.075375
60.871594
77.742699
43.896892
false
lodrick-the-lafted_Grafted-Hermetic-Platypus-C-2x7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-C-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Grafted-Hermetic-Platypus-C-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-C-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Grafted-Hermetic-Platypus-C-2x7B
777b95105f6e8e5a493cb3b38a21a6534a24d784
64.560922
apache-2.0
0
12
true
true
false
true
2024-03-22T14:21:39Z
false
59.300341
82.75244
62.24325
60.807833
78.137332
44.124337
false
lodrick-the-lafted_Grafted-Hermetic-Platypus-D-2x7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Grafted-Hermetic-Platypus-D-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Grafted-Hermetic-Platypus-D-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Hermetic-Platypus-D-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Grafted-Hermetic-Platypus-D-2x7B
d64cb44e12b446b1e532ecd6a8f6f8c60e1ee095
64.242692
apache-2.0
0
12
true
true
false
true
2024-03-02T14:31:16Z
false
58.87372
82.891854
61.955108
61.018106
77.426993
43.290371
false
lodrick-the-lafted_Grafted-Llama2-2x70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Grafted-Llama2-2x70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Grafted-Llama2-2x70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Llama2-2x70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Grafted-Llama2-2x70B
68b4f64541479fb6f6691de1fb2f4db07e1634e2
73.773956
llama2
3
125
true
false
false
true
2024-01-19T20:56:13Z
false
72.610922
89.57379
71.671044
66.492779
84.372534
57.922669
false
lodrick-the-lafted_Grafted-Wind-Elementals-2x70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Grafted-Wind-Elementals-2x70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Grafted-Wind-Elementals-2x70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Grafted-Wind-Elementals-2x70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Grafted-Wind-Elementals-2x70B
46f056338f51bcc7c80745b95e9198aec4c198d4
76.211465
other
0
125
true
false
false
true
2024-02-07T23:54:48Z
false
73.37884
89.075881
75.787229
65.568164
84.846093
68.612585
false
lodrick-the-lafted_Hermes-Instruct-7B-100K_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-100K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Hermes-Instruct-7B-100K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-100K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Hermes-Instruct-7B-100K
0dd712293d5b914d53f1e1f35922cd023ba98047
64.962731
apache-2.0
2
7
true
true
true
true
2024-02-20T07:06:49Z
false
61.518771
82.842063
60.946209
63.622129
76.874507
43.972707
false
lodrick-the-lafted_Hermes-Instruct-7B-217K_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-217K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Hermes-Instruct-7B-217K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-217K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Hermes-Instruct-7B-217K
25d52e51192738ddfc875e70dbaf1602ad4afd8f
64.809639
apache-2.0
0
7
true
true
true
true
2024-02-21T02:43:47Z
false
61.006826
82.6429
61.227877
61.813049
77.663773
44.503412
false
lodrick-the-lafted_Hermes-Instruct-7B-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Hermes-Instruct-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Hermes-Instruct-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Hermes-Instruct-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Hermes-Instruct-7B-v0.2
6675073736e1f611aaf48ef9777076183d233c96
63.818392
apache-2.0
1
7
true
true
true
true
2024-02-11T19:15:12Z
false
60.921502
82.961561
60.048352
61.012696
76.874507
41.091736
false
lodrick-the-lafted_Kaiju-A-57B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Kaiju-A-57B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Kaiju-A-57B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Kaiju-A-57B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Kaiju-A-57B
11fc415ccc69d9f5a72be7f90be0b48b9c782f67
63.636829
other
5
57
true
true
true
true
2024-01-26T17:04:41Z
false
58.788396
80.95001
72.658101
52.293325
78.768745
38.362396
false
lodrick-the-lafted_Kudzu-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Kudzu-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Kudzu-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Kudzu-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Kudzu-8B
6f208067b52df83c6f70e49e7e91fcaa7a9d5b2e
67.254151
apache-2.0
5
8
true
false
true
true
2024-05-11T20:09:58Z
false
62.457338
80.282812
68.136094
52.774994
76.79558
63.078089
false
lodrick-the-lafted_Olethros-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Olethros-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Olethros-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Olethros-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Olethros-8B
9db251f216c61bafdc688cccc60f42742a274436
68.093961
llama3
1
8
true
true
true
true
2024-04-29T08:08:08Z
false
61.860068
81.637124
67.983975
52.505285
76.874507
67.702805
false
lodrick-the-lafted_Platyboros-Instruct-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Platyboros-Instruct-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Platyboros-Instruct-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Platyboros-Instruct-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Platyboros-Instruct-7B
166c6ba6e9fb6fcb011d98c5cdbe68d17953d3d0
64.188645
apache-2.0
0
7
true
true
true
true
2024-02-22T00:59:03Z
false
57.764505
82.593109
62.049714
60.917761
78.137332
43.669447
false
lodrick-the-lafted_Winged-Lagomorph-2x13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/lodrick-the-lafted/Winged-Lagomorph-2x13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lodrick-the-lafted/Winged-Lagomorph-2x13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lodrick-the-lafted__Winged-Lagomorph-2x13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lodrick-the-lafted/Winged-Lagomorph-2x13B
f3959f69559f531fb9202798baf641b4af90c1bb
49.901049
llama2
3
21
true
false
false
true
2024-01-17T11:47:44Z
false
47.952218
69.388568
44.500052
44.536666
67.403315
25.625474
false
logicker_SkkuDS-DPO-72B-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/logicker/SkkuDS-DPO-72B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">logicker/SkkuDS-DPO-72B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
logicker/SkkuDS-DPO-72B-v1
5e194e1e44c6c2ebe294f854733f5c5532de5688
72.890483
other
0
72
true
true
true
true
2024-02-15T12:04:58Z
false
65.955631
85.998805
77.325799
59.543268
82.636148
65.883245
false
logicker_SkkuDS-DPO-72B-v3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/logicker/SkkuDS-DPO-72B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">logicker/SkkuDS-DPO-72B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDS-DPO-72B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
logicker/SkkuDS-DPO-72B-v3
5cf11f6e983a7c11b17c1b7c4aee6ff99e30ba82
72.803368
other
0
72
true
true
true
true
2024-02-27T17:10:45Z
false
66.040956
86.108345
77.336054
59.725243
82.636148
64.973465
false
logicker_SkkuDataScienceGlobal-10.7b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
<a target="_blank" href="https://huggingface.co/logicker/SkkuDataScienceGlobal-10.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">logicker/SkkuDataScienceGlobal-10.7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_logicker__SkkuDataScienceGlobal-10.7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
logicker/SkkuDataScienceGlobal-10.7b
4f5e40b38099084b86fb18b294e4e61e7d20cc7c
74.495332
0
10
false
true
true
true
2024-01-02T16:54:15Z
false
71.245734
88.408684
66.314602
71.924873
83.346488
65.731615
false
lole25_phi-2-sft-lora-ultrachat_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/lole25/phi-2-sft-lora-ultrachat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lole25/phi-2-sft-lora-ultrachat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lole25__phi-2-sft-lora-ultrachat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lole25/phi-2-sft-lora-ultrachat
09f410606332b5d29075d7031420291e257de570
61.0678
mit
0
0
true
true
true
true
2024-03-12T23:25:15Z
false
61.262799
74.855606
57.264281
45.459343
74.191002
53.373768
false
lole25_phi-2-sft-ultrachat-full_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/lole25/phi-2-sft-ultrachat-full" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lole25/phi-2-sft-ultrachat-full</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lole25__phi-2-sft-ultrachat-full" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lole25/phi-2-sft-ultrachat-full
67c2b0e28a60da6760002e8e77e639063cb9279d
60.660992
mit
1
2
true
true
true
true
2024-04-08T07:19:28Z
false
60.836177
74.61661
56.393556
46.064912
74.348856
51.705838
false
lole25_zephyr-7b-gpo-v6-i1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/lole25/zephyr-7b-gpo-v6-i1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lole25/zephyr-7b-gpo-v6-i1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lole25__zephyr-7b-gpo-v6-i1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lole25/zephyr-7b-gpo-v6-i1
d39dc9bab954fc2b18ba1b3d735d02b64ff8ea0b
63.656926
mit
0
7
true
true
true
true
2024-05-07T15:19:46Z
false
65.614334
85.829516
62.96062
56.13677
79.558011
31.842305
false
lorinma_yi6B_Vicuna_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lorinma/yi6B_Vicuna" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lorinma/yi6B_Vicuna</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lorinma__yi6B_Vicuna" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lorinma/yi6B_Vicuna
4ba7237cc904a14240f426154dc5233ef47db9e4
51.015256
mit
1
6
true
true
true
true
2024-01-10T12:16:43Z
false
46.16041
69.298944
58.429533
48.11267
65.66693
18.423048
false
louisbrulenaudet_Maxine-34B-stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Maxine-34B-stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Maxine-34B-stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Maxine-34B-stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Maxine-34B-stock
5d87d746433f6eaddf34fd1dbdeed859b15348aa
77.278206
apache-2.0
3
34
false
true
true
true
2024-04-04T20:31:00Z
false
74.061433
86.73571
76.619729
70.177501
83.898974
72.175891
false
louisbrulenaudet_Maxine-7B-0401-stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Maxine-7B-0401-stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Maxine-7B-0401-stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Maxine-7B-0401-stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Maxine-7B-0401-stock
a23c75b9b6d9c47bdd106af999f6a33c981e2bd6
76.733314
apache-2.0
1
7
false
true
true
true
2024-04-01T18:16:03Z
false
73.122867
89.125672
64.418016
78.069792
85.003946
70.659591
false
louisbrulenaudet_Maxine-7B-0401-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Maxine-7B-0401-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Maxine-7B-0401-ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Maxine-7B-0401-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Maxine-7B-0401-ties
f317d8b1807e65236b62f4b98e4b460754a6c82e
75.955795
0
7
false
true
true
true
2024-04-01T19:23:20Z
false
71.757679
88.836885
64.3481
74.514692
83.267561
73.009856
false
louisbrulenaudet_Pearl-34B-dare_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Pearl-34B-dare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Pearl-34B-dare</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-dare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Pearl-34B-dare
8c37fc9bad0de353a597b133a1570b556211c01b
73.706555
0
34
false
true
true
true
2024-02-13T00:07:19Z
false
68.430034
83.608843
76.398155
68.501363
81.767956
63.53298
false
louisbrulenaudet_Pearl-34B-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Pearl-34B-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Pearl-34B-ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-34B-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Pearl-34B-ties
be28f8663c6f49e1df04ddd59f4475cb93575272
75.48084
apache-2.0
3
34
false
true
true
true
2024-02-15T16:43:44Z
false
70.989761
84.833698
76.629847
70.320225
82.636148
67.47536
false
louisbrulenaudet_Pearl-3x7B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Pearl-3x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Pearl-3x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-3x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Pearl-3x7B
63499a3e77b66d0709c15208720d48e89b4c1786
67.227474
apache-2.0
1
18
true
false
false
true
2024-02-08T16:33:27Z
false
65.52901
85.540729
64.273431
52.167336
78.689818
57.164519
false
louisbrulenaudet_Pearl-7B-0210-dare_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-dare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Pearl-7B-0210-dare</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-dare" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Pearl-7B-0210-dare
b29298bbe30bba7c6aef25ef21cb9f4d470a10e2
73.460702
0
7
false
true
true
true
2024-02-11T15:32:09Z
false
70.904437
88.797052
61.686381
71.464604
84.530387
63.38135
false
louisbrulenaudet_Pearl-7B-0210-ties_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Pearl-7B-0210-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Pearl-7B-0210-ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0210-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Pearl-7B-0210-ties
d18d0fe9d70b8a2f4e2af33b6e771c8edef6ff97
74.656433
apache-2.0
0
7
false
true
true
true
2024-02-11T11:56:56Z
false
71.075085
88.627763
63.813332
70.46726
83.977901
69.977255
false
louisbrulenaudet_Pearl-7B-0211-ties_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Pearl-7B-0211-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Pearl-7B-0211-ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-0211-ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Pearl-7B-0211-ties
15db1e92e1683166a32da6f54c6ee6d6c10c20cb
75.112315
apache-2.0
3
7
false
true
true
true
2024-02-11T16:37:30Z
false
71.416382
88.856801
63.908211
71.460373
84.372534
70.659591
false
louisbrulenaudet_Pearl-7B-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisbrulenaudet/Pearl-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisbrulenaudet/Pearl-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Pearl-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisbrulenaudet/Pearl-7B-slerp
b4fef0d4a79ed1e5441d6a0d8fb06e0eda223d9e
72.745379
apache-2.0
6
7
false
true
true
true
2024-02-08T20:35:12Z
false
68.003413
87.163912
64.041679
62.352495
81.294396
73.616376
false
louisgrc_Marengoli_7B_SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisgrc/Marengoli_7B_SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisgrc/Marengoli_7B_SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisgrc__Marengoli_7B_SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisgrc/Marengoli_7B_SLERP
86b0adb1715855794161ba18db1c115f7ffa6ad7
76.424828
apache-2.0
0
7
true
false
true
true
2024-03-25T06:27:41Z
false
73.634812
89.235212
64.678988
77.22868
85.082873
68.6884
false
louisgrc_Montebello_7B_SLERP_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/louisgrc/Montebello_7B_SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">louisgrc/Montebello_7B_SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_louisgrc__Montebello_7B_SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
louisgrc/Montebello_7B_SLERP
1097b6038dc48f86382cacb1a27c76faacf8f607
76.496727
apache-2.0
0
7
true
false
true
true
2024-03-26T11:16:11Z
false
72.952218
89.065923
64.559055
79.326674
84.767167
68.309325
false
lqtrung1998_Codellama-7b-hf-ReFT-GSM8k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lqtrung1998/Codellama-7b-hf-ReFT-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lqtrung1998/Codellama-7b-hf-ReFT-GSM8k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lqtrung1998__Codellama-7b-hf-ReFT-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lqtrung1998/Codellama-7b-hf-ReFT-GSM8k
a97add0e026abe7ef5c58e0af0ec79f39eb58876
45.693605
llama2
0
7
true
true
true
true
2024-03-04T23:06:31Z
false
43.515358
64.528978
40.857658
37.283284
64.246251
23.730099
false
lqtrung1998_Codellama-7b-hf-ReFT-Rerank-GSM8k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForSequenceClassification
<a target="_blank" href="https://huggingface.co/lqtrung1998/Codellama-7b-hf-ReFT-Rerank-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lqtrung1998/Codellama-7b-hf-ReFT-Rerank-GSM8k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lqtrung1998__Codellama-7b-hf-ReFT-Rerank-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lqtrung1998/Codellama-7b-hf-ReFT-Rerank-GSM8k
b863eff60d154ed4d68349f75550377f9ff7fefc
30.17957
llama2
1
6
true
true
true
true
2024-03-04T23:05:45Z
false
29.266212
26.130253
24.643024
49.97242
51.065509
0
false
lqtrung1998_galactica-6.7b-ReFT-GSM8k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
OPTForCausalLM
<a target="_blank" href="https://huggingface.co/lqtrung1998/galactica-6.7b-ReFT-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lqtrung1998/galactica-6.7b-ReFT-GSM8k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lqtrung1998__galactica-6.7b-ReFT-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lqtrung1998/galactica-6.7b-ReFT-GSM8k
db019ea6f2762330d09f28bca53a5ecee8e2819a
38.145953
cc-by-nc-4.0
0
6
true
true
true
true
2024-03-04T23:08:33Z
false
40.699659
50.338578
37.619523
41.208865
58.326756
0.682335
false
lqtrung1998_galactica-6.7b-ReFT-Rerank-GSM8k_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
OPTForSequenceClassification
<a target="_blank" href="https://huggingface.co/lqtrung1998/galactica-6.7b-ReFT-Rerank-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lqtrung1998/galactica-6.7b-ReFT-Rerank-GSM8k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lqtrung1998__galactica-6.7b-ReFT-Rerank-GSM8k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lqtrung1998/galactica-6.7b-ReFT-Rerank-GSM8k
13f88bef7068492879a32eeee42597cc37fc727e
36.861219
cc-by-nc-4.0
0
6
true
true
true
true
2024-03-04T23:08:05Z
false
41.12628
48.775144
32.861167
41.195387
56.906077
0.30326
false
lu-vae_llama2-13B-sharegpt4-orca-openplatypus-8w_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13B-sharegpt4-orca-openplatypus-8w" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lu-vae/llama2-13B-sharegpt4-orca-openplatypus-8w
ad086aacf0176911133b6cccfb34364afce9de5a
55.752926
llama2
0
13
false
true
true
true
2023-10-16T12:48:18Z
false
62.798635
84.037044
55.134935
45.65749
75.138122
11.751327
false
lu-vae_llama2-13b-sharegpt4-test_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/lu-vae/llama2-13b-sharegpt4-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">lu-vae/llama2-13b-sharegpt4-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_lu-vae__llama2-13b-sharegpt4-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
lu-vae/llama2-13b-sharegpt4-test
2be36a2dab4ed0f97727a1508367f53d59950818
55.689155
llama2
0
13
false
true
true
true
2023-10-16T12:48:18Z
false
58.020478
82.652858
55.988782
48.271572
76.085241
13.115997
false
luffycodes_higgs-llama-vicuna-ep25-70b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/luffycodes/higgs-llama-vicuna-ep25-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">luffycodes/higgs-llama-vicuna-ep25-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__higgs-llama-vicuna-ep25-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
luffycodes/higgs-llama-vicuna-ep25-70b
1da59e150f1d0bae67f66400738a01d408a8c45d
63.598586
llama2
2
70
true
true
true
true
2023-11-06T10:31:15Z
false
62.286689
86.068512
64.25195
53.749738
80.662983
34.571645
false
luffycodes_llama-class-shishya-7b-ep3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/luffycodes/llama-class-shishya-7b-ep3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">luffycodes/llama-class-shishya-7b-ep3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__llama-class-shishya-7b-ep3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
luffycodes/llama-class-shishya-7b-ep3
92802ec9c58b1ed64d758c0f0c8420f4000636ff
43.883073
llama2
0
7
true
true
true
true
2023-12-14T12:29:18Z
false
40.784983
77.036447
46.743004
27.936842
70.797159
0
false
luffycodes_llama-shishya-7b-ep3-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">luffycodes/llama-shishya-7b-ep3-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
luffycodes/llama-shishya-7b-ep3-v1
8dc109f45ef36cc7bbd0f5d83fb65ac8e768d1bd
45.189851
llama2
0
7
true
true
true
true
2023-10-14T00:33:49Z
false
48.037543
76.628162
46.119297
30.898696
69.455406
0
false
luffycodes_llama-shishya-7b-ep3-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/luffycodes/llama-shishya-7b-ep3-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">luffycodes/llama-shishya-7b-ep3-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__llama-shishya-7b-ep3-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
luffycodes/llama-shishya-7b-ep3-v2
679c6cb9e869df686b1ae415ed440e6cfc05f80b
44.330368
llama2
0
7
true
true
true
true
2023-10-14T12:16:07Z
false
47.354949
75.881299
43.837845
30.163048
68.745067
0
false
luffycodes_mcq-hal-vicuna-13b-v1.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/luffycodes/mcq-hal-vicuna-13b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">luffycodes/mcq-hal-vicuna-13b-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-hal-vicuna-13b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
luffycodes/mcq-hal-vicuna-13b-v1.5
bb3029bce8347b09c2fd6908475b195bcabe53e3
52.702928
0
13
false
true
true
true
2023-10-16T12:48:18Z
false
55.972696
80.720972
52.849842
45.033375
72.770324
8.870356
false
luffycodes_mcq-vicuna-13b-v1.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/luffycodes/mcq-vicuna-13b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">luffycodes/mcq-vicuna-13b-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__mcq-vicuna-13b-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
luffycodes/mcq-vicuna-13b-v1.5
f769a92cfeffe8ee07beee8814ce7eca7cd62805
52.545327
0
13
false
true
true
true
2023-10-16T12:48:18Z
false
56.228669
81.149173
53.380719
44.079538
72.928177
7.505686
false