Column schema (26 columns; string lengths and class counts as reported by the dataset viewer):

| Column | dtype | Length / values |
|---|---|---|
| eval_name | string | length 9–97 |
| Precision | string | 5 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 53 classes |
| Model | string | length 355–611 |
| fullname | string | length 4–89 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 27–81.3 |
| Hub License | string | 35 classes |
| Hub ❤️ | int64 | 0–4.88k |
| #Params (B) | int64 | 0–238 |
| Available on the hub | bool | 2 classes |
| Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| date | string | length 0–26 |
| Chat Template | bool | 2 classes |
| ARC | float64 | 19.7–87.5 |
| HellaSwag | float64 | 20.7–92.8 |
| MMLU | float64 | 17.8–89.4 |
| TruthfulQA | float64 | 27.9–82.3 |
| Winogrande | float64 | 47.2–91.5 |
| GSM8K | float64 | 0–88.2 |
| Maintainers Choice | bool | 2 classes |
Leaderboard rows (one per evaluated model and precision). Three viewer columns duplicate information and are folded into the table: eval_name is always fullname with `/` replaced by `_`, followed by `_` and the Precision value; T is the emoji shorthand of Type; Model renders fullname as a hyperlink to `https://huggingface.co/{fullname}`, plus a 📑 link to the per-model details dataset `open-llm-leaderboard/details_{fullname with / replaced by __}`. Type legend: 🟢 pretrained · 🔶 fine-tuned on domain-specific datasets · 💬 chat models (RLHF, DPO, IFT, ...) · 🤝 base merges and moerges. Header abbreviations: On hub = Available on the hub, Chat = Chat Template, MC = Maintainers Choice. An empty Hub License cell means the dump carried no license value for that row; these are also the rows with On hub = false.

| fullname | Precision | Type | Weight type | Architecture | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | On hub | Merged | MoE | Flagged | date | Chat | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | MC |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| TurkuNLP/gpt3-finnish-large | float16 | 🟢 | Original | BloomModel | b9a3dd97387fc70d07010d469888a918842d3449 | 29.106786 | apache-2.0 | 4 | 0 | true | true | true | true | 2023-10-16T12:46:18Z | false | 21.757679 | 32.881896 | 24.112185 | 44.349887 | 51.539069 | 0 | false |
| TurkuNLP/gpt3-finnish-small | float16 | 🟢 | Original | BloomModel | 20a19af481bf59f38610a2977b2b513e9df51e3a | 27.954447 | apache-2.0 | 9 | 0 | true | true | true | true | 2023-10-16T13:19:55Z | false | 20.477816 | 28.092014 | 24.467397 | 46.465302 | 48.224152 | 0 | false |
| TwT-6/cr-model | bfloat16 | 🔶 | Original | Qwen2ForCausalLM | b9fa8eab366f58c8c13ab40234dc90ed19e47060 | 68.092619 | cc-by-nc-4.0 | 1 | 14 | true | true | true | true | 2024-05-09T03:26:55Z | false | 57.849829 | 81.65704 | 68.725783 | 58.196724 | 76.243094 | 65.883245 | false |
| TwT-6/cr-model-v1 | bfloat16 | 💬 | Original | Qwen2ForCausalLM | 4b9fdd5c5f6efe32c6cb1b7636c897610c9d8b65 | 77.321552 | cc-by-4.0 | 2 | 14 | true | true | true | true | 2024-05-29T02:42:42Z | false | 70.648464 | 87.851026 | 74.731803 | 80.470246 | 83.662194 | 66.56558 | false |
| TwT-6/open_llm_leaderboard_demo | bfloat16 | 💬 | Original | Qwen2ForCausalLM | dfb776c59e06792b11df7306a46f9612061131f9 | 67.918119 | | 0 | 14 | false | true | true | true | 2024-04-16T09:07:05Z | false | 58.105802 | 81.627166 | 68.528888 | 58.188156 | 76.085241 | 64.973465 | false |
| TwT-6/open_llm_leaderboard_demo | float16 | 🔶 | Original | Qwen2ForCausalLM | dfb776c59e06792b11df7306a46f9612061131f9 | 68.018745 | | 0 | 14 | false | true | true | true | 2024-04-15T06:10:48Z | false | 58.020478 | 81.587333 | 68.707045 | 58.214424 | 75.927388 | 65.6558 | false |
| TwT-6/open_llm_leaderboard_demo2 | float16 | 🔶 | Original | LlamaForCausalLM | 9eb73e7046c9784cd33d0c5d30828cfb087893ea | 57.253263 | cc-by-nc-4.0 | 1 | 10 | true | true | true | true | 2024-04-12T06:18:19Z | false | 62.372014 | 83.758215 | 65.65131 | 52.495735 | 79.242305 | 0 | false |
| TwT-6/open_llm_leaderboard_demo2 | bfloat16 | 🔶 | Original | LlamaForCausalLM | 9eb73e7046c9784cd33d0c5d30828cfb087893ea | 57.221831 | cc-by-nc-4.0 | 1 | 10 | true | true | true | true | 2024-04-16T03:00:13Z | false | 62.201365 | 83.758215 | 65.519516 | 52.454845 | 79.321231 | 0.075815 | false |
| UCLA-AGI/test | bfloat16 | 🔶 | Original | Unknown | 437d2f9c55aec50ebaedce22df8aaa7fcc0f9ff8 | 62.856721 | | 0 | 7 | false | true | true | true | 2024-01-04T07:02:55Z | false | 65.870307 | 85.441147 | 60.945497 | 57.388938 | 76.637727 | 30.85671 | false |
| UCLA-AGI/test-test | bfloat16 | 🔶 | Original | Unknown | 83731d11da3f0878effd3a32e5aea52249de7c81 | 63.519426 | | 0 | 7 | false | true | true | true | 2024-01-05T21:20:16Z | false | 66.382253 | 85.839474 | 61.221324 | 57.822583 | 76.79558 | 33.055345 | false |
| UCLA-AGI/test-test | float16 | 🔶 | Original | Unknown | 83731d11da3f0878effd3a32e5aea52249de7c81 | 63.535942 | | 0 | 7 | false | true | true | true | 2024-01-06T01:47:03Z | false | 66.467577 | 85.819558 | 61.47711 | 57.745889 | 76.953433 | 32.752085 | false |
| UCLA-AGI/test0 | bfloat16 | 🔶 | Original | Unknown | e90506303f046ebe6da9d8b41489a7365b455a06 | 62.374626 | | 0 | 7 | false | true | true | true | 2024-01-05T10:30:53Z | false | 63.651877 | 84.435371 | 61.005998 | 50.480562 | 77.979479 | 36.694466 | false |
| UCLA-AGI/test_final | bfloat16 | 🔶 | Original | MistralForCausalLM | b996460b9ac3969f2c685c3f3669ba944022b2be | 63.702539 | | 0 | 7 | false | true | true | true | 2024-01-11T18:41:31Z | false | 66.12628 | 85.849432 | 61.514577 | 57.894647 | 76.637727 | 34.19257 | false |
| UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0 | bfloat16 | 🔶 | Original | MistralForCausalLM | d457f58ca73bd5540dc4e12b70315e4464ea138c | 62.374626 | mit | 7 | 7 | true | true | true | true | 2024-01-11T18:39:55Z | false | 63.651877 | 84.435371 | 61.005998 | 50.480562 | 77.979479 | 36.694466 | false |
| UCLA-AGI/zephyr-7b-sft-full-SPIN-iter0 | float16 | 🔶 | Original | MistralForCausalLM | d457f58ca73bd5540dc4e12b70315e4464ea138c | 62.318695 | mit | 7 | 7 | true | true | true | true | 2024-01-11T18:39:38Z | false | 63.566553 | 84.425413 | 61.283068 | 50.342268 | 77.979479 | 36.31539 | false |
| UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1 | bfloat16 | 🔶 | Original | MistralForCausalLM | d8569aea49f28131ca3d319da343da0777ed4161 | 62.856721 | mit | 2 | 7 | true | true | true | true | 2024-01-05T05:04:03Z | false | 65.870307 | 85.441147 | 60.945497 | 57.388938 | 76.637727 | 30.85671 | false |
| UCLA-AGI/zephyr-7b-sft-full-SPIN-iter2 | bfloat16 | 🔶 | Original | MistralForCausalLM | 336bff60f5ce904c2ab9633315192df904431afa | 63.519426 | mit | 4 | 7 | true | true | true | true | 2024-01-11T18:39:08Z | false | 66.382253 | 85.839474 | 61.221324 | 57.822583 | 76.79558 | 33.055345 | false |
| UCLA-AGI/zephyr-7b-sft-full-SPIN-iter3 | bfloat16 | 💬 | Original | MistralForCausalLM | f4d6d3b9fce399c73c727eb5f7e68a10ae751ad4 | 63.702539 | mit | 26 | 7 | true | true | true | true | 2024-01-20T00:44:11Z | false | 66.12628 | 85.849432 | 61.514577 | 57.894647 | 76.637727 | 34.19257 | false |
| UCLA-AGI/zephyr-7b-sft-full-spin-iter1 | bfloat16 | 🔶 | Original | Unknown | 9257b6484010acf5eed7e77ff787264b49c1a923 | 62.856721 | | 0 | 7 | false | true | true | true | 2024-01-05T04:16:15Z | false | 65.870307 | 85.441147 | 60.945497 | 57.388938 | 76.637727 | 30.85671 | false |
| Unbabel/TowerBase-7B-v0.1 | bfloat16 | 🔶 | Original | LlamaForCausalLM | 227253877d67620f45c7b45ff22ead1dc6e03e4f | 49.108991 | cc-by-nc-4.0 | 48 | 6 | true | true | true | true | 2024-01-12T11:18:08Z | false | 51.023891 | 77.683728 | 43.477831 | 37.292514 | 72.059984 | 13.115997 | false |
| Unbabel/TowerInstruct-7B-v0.1 | bfloat16 | 💬 | Original | LlamaForCausalLM | d97a456da8a218425b5171a906a7d9a0c5cd7b2f | 52.390627 | cc-by-nc-4.0 | 58 | 6 | true | true | true | true | 2024-01-12T11:18:27Z | false | 55.460751 | 78.998208 | 46.884016 | 42.594705 | 73.954223 | 16.451857 | false |
| Undi95/Amethyst-13B | float16 | 🔶 | Original | LlamaForCausalLM | d4a85b1006f0b9439e64f0e7400533a7b867c24d | 56.620175 | cc-by-nc-4.0 | 9 | 13 | true | true | true | true | 2023-10-16T12:48:18Z | false | 62.627986 | 83.170683 | 55.91015 | 52.427198 | 74.743489 | 10.841547 | false |
| Undi95/Amethyst-13B-Mistral | float16 | 🔶 | Original | Unknown | 4328809e568f01e3f0a05764e3bb58e901310415 | 56.620175 | | 0 | 12 | false | true | true | true | 2023-10-16T12:54:17Z | false | 62.627986 | 83.170683 | 55.91015 | 52.427198 | 74.743489 | 10.841547 | false |
| Undi95/Borealis-10.7B-DPO | bfloat16 | 🔶 | Original | LlamaForCausalLM | 9d6e34fa51cd3c4745a044fbb2bca91b1c9a9f5a | 59.177419 | cc-by-nc-4.0 | 6 | 10 | true | true | true | true | 2024-01-21T15:02:38Z | false | 57.935154 | 81.208923 | 60.739515 | 46.368488 | 75.453828 | 33.358605 | false |
| Undi95/C-Based-2x7B | bfloat16 | 🤝 | Original | MixtralForCausalLM | ae2914cb1fc547a441526e1eecd0ea139ec1adc5 | 66.468062 | cc-by-nc-4.0 | 1 | 12 | true | true | false | true | 2024-03-29T08:12:30Z | false | 65.52901 | 85.002987 | 64.589941 | 50.16483 | 81.057616 | 52.463988 | false |
| Undi95/Clover3-17B | float16 | 🔶 | Original | MistralForCausalLM | 428f6f58869426baae2c49442b207a15bc2da3cc | 56.613037 | cc-by-nc-4.0 | 10 | 16 | true | true | true | true | 2023-12-12T00:14:12Z | false | 59.897611 | 81.179048 | 60.466811 | 40.721737 | 78.610892 | 18.802123 | false |
| Undi95/CodeEngine | float16 | 🔶 | Original | LlamaForCausalLM | f57879831c39f2dcb656cb2c9e9ce5878e92bb44 | 52.683211 | cc-by-nc-4.0 | 0 | 0 | true | true | false | true | 2023-10-16T12:48:18Z | false | 58.361775 | 82.274447 | 54.178828 | 45.18228 | 74.585635 | 1.5163 | false |
| Undi95/CreativityEngine | float16 | 🔶 | Original | LlamaForCausalLM | 7870cc50b82b5cbebfa9935b6d73a9d20170299a | 55.247407 | cc-by-nc-4.0 | 0 | 0 | true | true | false | true | 2023-10-16T12:54:17Z | false | 59.300341 | 82.42382 | 53.552655 | 52.463933 | 74.191002 | 9.552691 | false |
| Undi95/Emerald-13B | float16 | 🔶 | Original | LlamaForCausalLM | f7696299463d8ec402a4e1eb001f3a447f1c5552 | 56.89243 | cc-by-nc-4.0 | 2 | 13 | true | true | true | true | 2023-10-16T12:46:18Z | false | 62.286689 | 83.688508 | 55.695608 | 50.943651 | 75.927388 | 12.812737 | false |
| Undi95/Emerhyst-20B | float16 | 🔶 | Original | LlamaForCausalLM | e4c23af4f5dd88cb27d245e2bfc3b81db652632c | 57.065211 | cc-by-nc-4.0 | 44 | 19 | true | true | true | true | 2023-10-16T12:54:17Z | false | 61.68942 | 84.983071 | 56.97927 | 54.162982 | 76.085241 | 8.491281 | false |
| Undi95/LewdEngine | float16 | 🔶 | Original | LlamaForCausalLM | 6e918ff9f563552af4ad66f4308f6d040e24af4b | 54.884481 | cc-by-nc-4.0 | 2 | 0 | true | true | true | true | 2023-09-09T10:52:17Z | false | 60.494881 | 83.08106 | 54.842428 | 43.629332 | 74.901342 | 12.357847 | false |
| Undi95/Llama-3-8B-Instruct-OAS | float16 | 🔶 | Original | LlamaForCausalLM | 956e7eeb433c8ffc9ef12367fe4ce1bd99efbbca | 66.49022 | | 0 | 8 | false | true | true | true | 2024-05-02T10:56:54Z | false | 61.68942 | 78.450508 | 67.033778 | 51.416139 | 75.453828 | 64.89765 | false |
| Undi95/Llama-3-LewdPlay-8B | float16 | 🔶 | Original | LlamaForCausalLM | 18854f9d914cfe6c35537b3700d59f3dc5897b78 | 65.995007 | cc-by-nc-4.0 | 9 | 8 | true | true | false | true | 2024-04-28T00:24:32Z | false | 63.054608 | 80.711014 | 66.954412 | 50.121572 | 77.584846 | 57.543594 | false |
| Undi95/Llama-3-LewdPlay-8B-evo | float16 | 🤝 | Original | LlamaForCausalLM | 5c1a4a19391133501c7f3ce8dfa5a1cd90c851d6 | 67.444661 | cc-by-nc-4.0 | 26 | 8 | true | false | true | true | 2024-04-30T01:54:33Z | false | 62.542662 | 79.685322 | 67.869461 | 50.65574 | 75.453828 | 68.460955 | false |
| Undi95/Llama-3-Unholy-8B | bfloat16 | 🔶 | Original | LlamaForCausalLM | 1751bd3a64abeee08a9ac2243d4040ae15634533 | 67.248063 | cc-by-nc-4.0 | 27 | 8 | true | true | false | true | 2024-04-20T12:59:01Z | false | 62.030717 | 79.486158 | 67.33474 | 50.889165 | 75.059195 | 68.6884 | false |
| Undi95/Llama2-13B-no_robots-alpaca-lora | float16 | 🔶 | Adapter | LlamaForCausalLM | 581aba329e607533c299746bb9eb4154a7aab139 | 52.768111 | cc-by-nc-4.0 | 9 | 13 | true | true | true | true | 2023-11-15T02:17:52Z | false | 58.87372 | 82.433778 | 53.105321 | 40.455598 | 75.295975 | 6.444276 | false |
| Undi95/Llamix2-MLewd-4x13B | float16 | 🔶 | Original | MixtralForCausalLM | 19961590ae95ccd9316b13c66098cd61b28a7d5a | 61.597997 | cc-by-nc-4.0 | 58 | 38 | true | true | false | true | 2023-12-15T17:43:30Z | false | 61.006826 | 83.170683 | 56.322418 | 50.346966 | 75.374901 | 43.366187 | false |
| Undi95/Llamix2-Xwin-MoE-4x13B | float16 | 🔶 | Original | Unknown | 220833f87c233684e8a4b0e03126ffcdffce5229 | 57.931415 | | 0 | 38 | false | true | false | true | 2023-12-16T01:58:11Z | false | 60.409556 | 82.961561 | 56.240183 | 39.632094 | 75.138122 | 33.206975 | false |
| Undi95/MLewd-Chat-v2-13B | float16 | 🔶 | Original | LlamaForCausalLM | f6181961a6a2f9ca534e1a8907b4a4459be6b6bd | 57.234755 | cc-by-nc-4.0 | 18 | 13 | true | true | true | true | 2023-10-16T12:48:18Z | false | 61.860068 | 83.808006 | 57.000372 | 54.508076 | 75.769534 | 10.462472 | false |
| Undi95/MLewd-L2-13B | float16 | 🔶 | Original | LlamaForCausalLM | feb1fa71e0b24261d3ca428b4aed881dd31f166e | 53.117428 | cc-by-nc-4.0 | 4 | 13 | true | true | true | true | 2023-09-09T10:52:17Z | false | 58.276451 | 82.324238 | 54.67034 | 48.664022 | 73.480663 | 1.288855 | false |
| Undi95/MLewd-L2-Chat-13B | float16 | 🔶 | Original | LlamaForCausalLM | 6c66622a99c1bc73498aa6a15a59da825d875310 | 57.754504 | cc-by-nc-4.0 | 31 | 13 | true | true | true | true | 2023-11-06T10:31:15Z | false | 62.030717 | 84.186417 | 58.751023 | 52.835439 | 77.426993 | 11.296437 | false |
| Undi95/MLewd-ReMM-L2-Chat-20B | float16 | 🔶 | Original | LlamaForCausalLM | cda06630a1d8173541431e5ce8bc17dcfaa37e5e | 58.491393 | cc-by-nc-4.0 | 25 | 19 | true | true | true | true | 2023-10-16T12:46:18Z | false | 62.457338 | 85.620394 | 59.133534 | 55.629517 | 77.190213 | 10.917362 | false |
| Undi95/MLewd-ReMM-L2-Chat-20B-Inverted | float16 | 🔶 | Original | LlamaForCausalLM | b5b501b4d23ec7ab24b827f79e48b2c67e548ddb | 57.24715 | cc-by-nc-4.0 | 5 | 19 | true | true | true | true | 2023-10-16T13:27:38Z | false | 61.68942 | 85.321649 | 57.997213 | 53.765138 | 75.611681 | 9.097801 | false |
| Undi95/MLewd-v2.4-13B | float16 | 🔶 | Original | LlamaForCausalLM | 6f6ec6024ee054020e49fd96f149919692848f0b | 56.373266 | cc-by-nc-4.0 | 41 | 13 | true | true | true | true | 2023-11-06T10:31:15Z | false | 61.68942 | 83.827923 | 55.098135 | 53.337271 | 74.506709 | 9.780136 | false |
| Undi95/MLewdBoros-L2-13B | float16 | 🔶 | Original | LlamaForCausalLM | a3033ac5825662f1c66418d7543648dc76980185 | 56.514718 | cc-by-nc-4.0 | 22 | 13 | true | true | true | true | 2023-10-16T13:19:55Z | false | 62.542662 | 83.89763 | 56.565298 | 48.136107 | 76.953433 | 10.993177 | false |
| Undi95/MM-ReMM-L2-20B | float16 | 🔶 | Original | LlamaForCausalLM | 37869800c15fb37d017ea83bb50fec6d6141f6ba | 56.551585 | cc-by-nc-4.0 | 3 | 19 | true | true | true | true | 2023-11-06T10:31:15Z | false | 60.836177 | 85.182235 | 56.454092 | 53.334343 | 75.769534 | 7.733131 | false |
| Undi95/MXLewd-L2-20B | float16 | 🔶 | Original | LlamaForCausalLM | ac279478abd9ddb8d1f5adcc548be0287b963adf | 57.428312 | cc-by-nc-4.0 | 24 | 19 | true | true | true | true | 2023-10-16T12:54:17Z | false | 63.225256 | 85.331607 | 57.363312 | 51.647094 | 76.085241 | 10.917362 | false |
| Undi95/Meta-Llama-3-8B-hf | bfloat16 | 🟢 | Original | LlamaForCausalLM | c1a8ab86a087f2af95cbc6fd8fde5a0ba7b799a9 | 62.354406 | other | 22 | 8 | true | true | true | true | 2024-04-18T22:26:34Z | false | 59.215017 | 82.015535 | 66.494955 | 43.952265 | 77.111287 | 45.337377 | false |
| Undi95/Miqu-70B-Alpaca-DPO | bfloat16 | 💬 | Original | LlamaForCausalLM | f7ee9b9099cd518060e9e61ff7ae11a39428bd93 | 76.600324 | | 6 | 70 | false | true | true | true | 2024-02-08T13:06:44Z | false | 73.208191 | 88.597889 | 75.41051 | 69.435597 | 85.398579 | 67.551175 | false |
| Undi95/Miqu-MS-70B | bfloat16 | 🤝 | Original | LlamaForCausalLM | 2aa17f8d8aadc2c8bf2aed438a6714fe3dbd9794 | 76.740161 | cc-by-nc-4.0 | 8 | 68 | true | false | true | true | 2024-04-01T21:39:43Z | false | 73.293515 | 88.627763 | 75.481445 | 69.317894 | 85.714286 | 68.006065 | false |
| Undi95/Mistral-11B-TestBench11 | bfloat16 | 🔶 | Original | Unknown | 9aae2b156b24557bb98e515f3a90c7865529d2e9 | 60.254226 | | 0 | 10 | false | true | true | true | 2023-10-16T12:54:17Z | false | 64.419795 | 83.927504 | 63.820726 | 56.679075 | 77.742699 | 14.935557 | false |
| Undi95/Mistral-11B-TestBench9 | bfloat16 | 🔶 | Original | Unknown | 4ff48527af8c3907129c06160c7f7b7b786a5a79 | 60.517134 | | 0 | 10 | false | true | true | true | 2023-11-06T10:31:15Z | false | 64.078498 | 84.236208 | 63.998418 | 56.188046 | 78.453039 | 16.148597 | false |
| Undi95/Mistral-11B-v0.1 | bfloat16 | 🔶 | Original | MistralForCausalLM | e9698271ea1ab340bacfd5ebf0d77108a6f18a90 | 58.045924 | apache-2.0 | 15 | 10 | true | true | true | true | 2023-12-29T22:27:45Z | false | 59.556314 | 81.16909 | 63.557209 | 40.668322 | 76.637727 | 26.686884 | false |
| Undi95/Mixtral-4x7B-DPO-RPChat | bfloat16 | 🔶 | Original | MixtralForCausalLM | 406aeb5ce848dfefbca65d69022ce1de36f9fde4 | 65.884247 | cc-by-nc-4.0 | 9 | 24 | true | true | false | true | 2023-12-15T00:20:15Z | false | 64.590444 | 85.361482 | 63.567455 | 49.871036 | 78.768745 | 53.146323 | false |
| Undi95/Mixtral-8x7B-MoE-RP-Story | bfloat16 | 🔶 | Original | MixtralForCausalLM | ce4a4e4ffec063a3e338b6ebc328365270b6c5f0 | 47.226419 | cc-by-nc-4.0 | 38 | 46 | true | true | false | true | 2023-12-15T20:42:12Z | false | 51.535836 | 69.996017 | 43.039266 | 41.531241 | 67.324388 | 9.931766 | false |
| Undi95/Nous-Hermes-13B-Code | float16 | 🔶 | Original | LlamaForCausalLM | 5a45cb2a6442581ce32cc19c561c49cec1db4ebb | 55.93325 | cc-by-nc-4.0 | 7 | 13 | true | true | false | true | 2023-09-09T10:52:17Z | false | 61.177474 | 83.210516 | 55.131668 | 50.555063 | 75.138122 | 10.386657 | false |
| Undi95/OpenRP-13B | float16 | 🔶 | Original | LlamaForCausalLM | d11815287c51ef51485fb003f8f72773cf6f19a4 | 56.566961 | cc-by-nc-4.0 | 3 | 13 | true | true | true | true | 2023-10-16T13:19:55Z | false | 62.116041 | 82.603067 | 57.501251 | 48.286539 | 76.006314 | 12.888552 | false |
| Undi95/PsyMedRP-v1-20B | float16 | 🔶 | Original | LlamaForCausalLM | 78188101b6331d9e61ef80f0971d715de100b44a | 57.541411 | cc-by-nc-4.0 | 33 | 19 | true | true | true | true | 2024-02-16T03:59:10Z | false | 60.494881 | 83.937463 | 56.684287 | 54.449676 | 74.822415 | 14.859742 | false |
| Undi95/ReMM-L2-13B | float16 | 🔶 | Original | Unknown | c4710577003a23ca8e9040d16dfb8f3e9bc5d636 | 54.057399 | | 0 | 13 | false | true | true | true | 2023-09-09T10:52:17Z | false | 59.726962 | 83.100976 | 54.114627 | 49.938332 | 74.506709 | 2.956785 | false |
| Undi95/ReMM-L2-13B-PIPPA | float16 | 🔶 | Original | LlamaForCausalLM | 79e711178c6881496ae1f5635b08bc193f370709 | 54.057888 | cc-by-nc-4.0 | 1 | 13 | true | true | true | true | 2023-10-16T12:46:18Z | false | 59.726962 | 83.120892 | 54.100799 | 49.935182 | 74.506709 | 2.956785 | false |
| Undi95/ReMM-Mistral-13B | float16 | 🔶 | Original | Unknown | a5ef9385d9430a81778183d71b58eb2b869d6a7e | 56.888247 | | 0 | 12 | false | true | true | true | 2023-10-16T12:54:17Z | false | 62.201365 | 83.817965 | 55.430494 | 53.318361 | 74.506709 | 12.054587 | false |
| Undi95/ReMM-SLERP-L2-13B | float16 | 🔶 | Original | LlamaForCausalLM | 27baccf242bc1dc34fc39661a40bbf867cbea8b5 | 56.029548 | cc-by-nc-4.0 | 19 | 13 | true | true | true | true | 2023-10-16T13:19:55Z | false | 60.921502 | 83.559052 | 55.332433 | 51.973639 | 75.217048 | 9.173616 | false |
| Undi95/ReMM-v2-L2-13B | float16 | 🔶 | Original | LlamaForCausalLM | bc42c77f88482c37c72c85c66135e99972bbca1b | 56.988698 | cc-by-nc-4.0 | 3 | 13 | true | true | true | true | 2023-10-16T12:46:18Z | false | 61.945392 | 83.997212 | 56.138039 | 50.811273 | 75.848461 | 13.191812 | false |
| Undi95/ReMM-v2.1-L2-13B | float16 | 🔶 | Original | LlamaForCausalLM | e6b5ac97f74355cb281a621261debe5720fb4da2 | 56.711397 | cc-by-nc-4.0 | 1 | 13 | true | true | true | true | 2023-10-16T12:54:17Z | false | 61.433447 | 83.917546 | 55.95486 | 50.29822 | 75.927388 | 12.736922 | false |
| Undi95/ReMM-v2.2-L2-13B | float16 | 🔶 | Original | LlamaForCausalLM | d55031fbcd41d749bc0c0ffbcd85636718d373b6 | 57.104151 | cc-by-nc-4.0 | 5 | 13 | true | true | true | true | 2023-10-16T12:46:18Z | false | 61.262799 | 84.156543 | 56.216943 | 51.351167 | 75.611681 | 14.025777 | false |
| Undi95/U-Amethyst-20B | float16 | 🔶 | Original | LlamaForCausalLM | c0cbe0b3c88041bb6beef27dbe85146af8dddec9 | 55.648123 | cc-by-nc-4.0 | 28 | 19 | true | true | true | true | 2023-11-06T10:31:15Z | false | 62.201365 | 83.110934 | 55.877162 | 53.201222 | 74.191002 | 5.307051 | false |
| Undi95/UndiMix-v1-13b | float16 | 🔶 | Original | LlamaForCausalLM | fd311f52648825d6988d2f945918468ceb32289f | 55.498109 | cc-by-nc-4.0 | 0 | 13 | true | true | true | true | 2023-09-09T10:52:17Z | false | 59.47099 | 82.453694 | 55.826078 | 49.776483 | 75.453828 | 10.007582 | false |
| Undi95/UndiMix-v4-13B | float16 | 🔶 | Original | LlamaForCausalLM | 6dd97c74cfe1d22432d5c993814e230f333ba401 | 56.927417 | cc-by-nc-4.0 | 4 | 13 | true | true | true | true | 2023-10-16T12:54:17Z | false | 61.945392 | 83.877714 | 56.899514 | 48.955196 | 76.164167 | 13.722517 | false |
| Undi95/Unholy-v1-12L-13B | float16 | 🔶 | Original | LlamaForCausalLM | ee25c078f08b0812d82597afa3f5e877c19a5c83 | 57.471347 | cc-by-nc-4.0 | 37 | 13 | true | true | true | true | 2023-10-16T13:27:38Z | false | 63.566553 | 83.748257 | 58.081365 | 51.093776 | 77.26914 | 11.068992 | false |
| Undi95/X-MythoChronos-13B | float16 | 🔶 | Original | LlamaForCausalLM | 8d302741466512f0621a594fce6bf5b8125c8d4c | 58.428345 | cc-by-nc-4.0 | 15 | 13 | true | true | true | true | 2023-12-06T17:52:18Z | false | 59.726962 | 83.389763 | 56.503963 | 53.549649 | 74.427782 | 22.971948 | false |
| UnicomLLM/Unichat-llama3-Chinese-8B-28K | bfloat16 | 🔶 | Original | LlamaForCausalLM | 076a73e96b69e3079d7eff5e78c8f9e3e0757feb | 55.774084 | apache-2.0 | 17 | 8 | true | true | true | true | 2024-05-12T23:45:39Z | false | 56.740614 | 76.52858 | 55.267886 | 47.008348 | 69.455406 | 29.643669 | false |
| VAGOsolutions/Llama-3-SauerkrautLM-8b-Instruct | bfloat16 | 🔶 | Original | LlamaForCausalLM | 363937b547b6f77738a252bfdd6ce9e11635a9a7 | 73.742473 | other | 47 | 8 | true | true | true | true | 2024-04-22T23:14:34Z | false | 73.720137 | 89.414459 | 68.06694 | 66.248265 | 80.031571 | 64.973465 | false |
| VAGOsolutions/SauerkrautLM-14b-MoE-LaserChat | bfloat16 | 🔶 | Original | MixtralForCausalLM | e3d7c73110dd6edd9e96b1f3d9b0dea91d83ce2d | 71.596978 | apache-2.0 | 6 | 12 | true | true | false | true | 2024-03-08T00:18:17Z | false | 66.723549 | 84.883489 | 65.170112 | 57.64161 | 81.925809 | 73.237301 | false |
| VAGOsolutions/SauerkrautLM-7b-HerO | float16 | 🔶 | Original | MistralForCausalLM | 0aeb810af28e2910a92b929c21b931a5c06073de | 64.485373 | apache-2.0 | 31 | 7 | true | false | true | true | 2023-11-27T03:35:27Z | false | 63.225256 | 83.519219 | 63.295615 | 49.218277 | 78.374112 | 49.279757 | false |
| VAGOsolutions/SauerkrautLM-7b-LaserChat | bfloat16 | 💬 | Original | MistralForCausalLM | 03b8cef6f31e2a6816186d1bddadd938c19f1cd7 | 70.318409 | apache-2.0 | 8 | 7 | true | true | true | true | 2024-02-07T01:44:22Z | false | 67.576792 | 83.578968 | 64.930838 | 56.08406 | 80.899763 | 68.84003 | false |
| VAGOsolutions/SauerkrautLM-Gemma-2b | bfloat16 | 🔶 | Original | GemmaForCausalLM | 29075d62fc6ffe23c3c517aa9afe5c9fc1621b81 | 48.920017 | other | 7 | 2 | true | true | true | true | 2024-03-07T01:39:20Z | false | 48.720137 | 71.410078 | 42.900103 | 35.771282 | 67.955801 | 26.762699 | false |
| VAGOsolutions/SauerkrautLM-Gemma-7b | bfloat16 | 🔶 | Original | GemmaForCausalLM | 87cf83507c53dc0a41f8ecd0c961235b42c20ade | 67.827954 | other | 13 | 8 | true | true | true | true | 2024-02-29T18:39:23Z | false | 59.982935 | 81.905995 | 63.761258 | 60.995201 | 76.637727 | 63.68461 | false |
| VAGOsolutions/SauerkrautLM-Mixtral-8x7B | bfloat16 | 🔶 | Original | MixtralForCausalLM | 82dc0ab70090085b4271f0f317f667f180db9872 | 67.800026 | apache-2.0 | 11 | 46 | true | true | false | true | 2023-12-20T00:42:11Z | false | 68.856655 | 86.008763 | 66.694714 | 57.198879 | 80.50513 | 47.536012 | false |
| VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct | bfloat16 | 🔶 | Original | MixtralForCausalLM | 330eb185920d6a470b265a4b31217c60e810fb3e | 72.728392 | apache-2.0 | 21 | 46 | true | true | false | true | 2023-12-20T00:18:50Z | false | 70.56314 | 87.741486 | 71.07702 | 65.718415 | 81.452249 | 59.818044 | false |
| VAGOsolutions/SauerkrautLM-Mixtral-8x7B-Instruct | float16 | 🔶 | Original | MixtralForCausalLM | 330eb185920d6a470b265a4b31217c60e810fb3e | 72.88757 | apache-2.0 | 21 | 46 | true | true | false | true | 2023-12-27T02:26:28Z | false | 70.477816 | 87.751444 | 71.365851 | 65.711201 | 81.21547 | 60.803639 | false |
| VAGOsolutions/SauerkrautLM-Qwen-32b | bfloat16 | 🔶 | Original | Qwen2ForCausalLM | f5306490ea218e74a88e7431fe2f4c2d7f16b721 | 67.387604 | other | 4 | 32 | true | true | true | true | 2024-04-13T00:56:19Z | false | 58.788396 | 82.154949 | 74.191555 | 61.039539 | 81.373323 | 46.777862 | false |
VAGOsolutions_SauerkrautLM-SOLAR-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAGOsolutions/SauerkrautLM-SOLAR-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-SOLAR-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAGOsolutions/SauerkrautLM-SOLAR-Instruct
8b9615124a0bcadd7fa984eaadd066da0fb4fbae
74.205695
cc-by-nc-4.0
44
10
true
true
true
true
2023-12-20T01:37:23Z
false
70.819113
88.627763
66.19741
71.94604
83.504341
64.1395
false
VAIBHAV22334455_JARVIS_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/VAIBHAV22334455/JARVIS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VAIBHAV22334455/JARVIS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_VAIBHAV22334455__JARVIS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VAIBHAV22334455/JARVIS
28091aa912d17a231f59a18a286f289928c098fb
35.783921
apache-2.0
7
2
true
true
true
true
2024-03-27T12:19:47Z
false
32.081911
56.861183
27.14723
37.333909
60.142068
1.137225
false
VMware_open-llama-0.7T-7B-open-instruct-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VMware/open-llama-0.7T-7B-open-instruct-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VMware/open-llama-0.7T-7B-open-instruct-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_VMware__open-llama-0.7T-7B-open-instruct-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VMware/open-llama-0.7T-7B-open-instruct-v1.1
75741b55ad462330e3498d1506f438f835152177
41.112579
cc
4
7
true
true
true
true
2023-09-09T10:52:17Z
false
46.672355
67.665804
28.545424
37.603591
65.43015
0.75815
false
VMware_open-llama-7b-open-instruct_float16
float16
?
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/VMware/open-llama-7b-open-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VMware/open-llama-7b-open-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_VMware__open-llama-7b-open-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VMware/open-llama-7b-open-instruct
fdf9f034163cce67e04d55172155f0e07b1b19a0
42.590478
cc-by-sa-3.0
26
7
true
true
true
true
2023-10-16T12:48:18Z
false
49.744027
73.670584
31.522012
34.645392
65.43015
0.530705
false
Vmware_open-llama-7b-v2-open-instruct_float16
float16
?
?
Original
Unknown
<a target="_blank" href="https://huggingface.co/Vmware/open-llama-7b-v2-open-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vmware/open-llama-7b-v2-open-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Vmware__open-llama-7b-v2-open-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vmware/open-llama-7b-v2-open-instruct
b8fbe09571a71603ab517fe897a1281005060b62
42.753694
0
0
false
true
true
true
false
39.761092
70.314678
35.159473
39.531875
64.325178
7.429871
false
ValiantLabs_Esper-70b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Esper-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Esper-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__Esper-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Esper-70b
13c5bb97ed6c5faaaa2e2a57fbb60aaff61a0f4c
59.026325
apache-2.0
2
68
true
true
true
true
2024-03-14T18:02:30Z
false
56.484642
77.723561
55.909745
45.980111
73.480663
44.579227
false
ValiantLabs_Fireplace-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Fireplace-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Fireplace-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__Fireplace-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Fireplace-13b
1c37006534c4352f19c0b7ee857ed00601644068
50.341374
apache-2.0
2
13
true
true
true
true
2024-01-18T21:03:21Z
false
47.696246
69.607648
43.560687
48.240026
67.166535
25.777104
false
ValiantLabs_Fireplace-34b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Fireplace-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Fireplace-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__Fireplace-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Fireplace-34b
58c8df51a5963cd206301461edb68fa86ba059ed
67.438181
other
2
34
true
true
true
true
2024-03-26T15:53:27Z
false
71.245734
82.722565
47.00603
65.112667
79.558011
58.984079
false
ValiantLabs_Llama3-70B-Fireplace_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/Llama3-70B-Fireplace" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/Llama3-70B-Fireplace</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__Llama3-70B-Fireplace" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/Llama3-70B-Fireplace
220079e4115733991eb19c30d5480db9696a665e
76.748515
llama3
3
70
true
true
true
true
2024-05-09T19:44:54Z
false
70.648464
85.002987
78.970165
59.76722
82.478295
83.623958
false
ValiantLabs_ShiningValiant_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/ShiningValiant" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/ShiningValiant</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__ShiningValiant" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/ShiningValiant
7c4401cddc462c5f35d8984c90e293faee37bf8e
70.339767
llama2
73
68
true
true
true
true
2023-12-30T17:36:33Z
false
68.686007
87.313284
69.643952
55.777421
84.135754
56.482183
false
ValiantLabs_ShiningValiantXS_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ValiantLabs/ShiningValiantXS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ValiantLabs/ShiningValiantXS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ValiantLabs__ShiningValiantXS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ValiantLabs/ShiningValiantXS
8c1f86bd2e646408eed2ed3a2634b38ea4e5c599
59.555952
llama2
11
13
true
true
true
true
2024-01-12T06:32:29Z
false
58.959044
81.925911
56.753723
48.702659
76.953433
34.04094
false
Vasanth_Beast-Soul_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Vasanth/Beast-Soul" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vasanth/Beast-Soul</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Beast-Soul" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vasanth/Beast-Soul
700aacf29bde13dfef2a5f15c5a5d6627c73d80d
74.367604
apache-2.0
0
7
true
false
true
true
2024-01-22T00:03:15Z
false
72.525597
88.149771
64.764397
66.756671
83.425414
70.583776
false
Vasanth_Beast-Soul-new_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Vasanth/Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vasanth/Beast-Soul-new</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Beast-Soul-new" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vasanth/Beast-Soul-new
d4a6d05f84f82b0a6ad625dd2473115ca972c1db
74.764836
apache-2.0
0
7
true
false
true
true
2024-02-03T06:20:26Z
false
73.122867
88.348934
64.74453
67.382147
85.240726
69.74981
false
Vasanth_Valor_Macaroni_moe_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/Vasanth/Valor_Macaroni_moe" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vasanth/Valor_Macaroni_moe</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Valor_Macaroni_moe" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vasanth/Valor_Macaroni_moe
dbd8fcc7b2987cc3a1802561f63e483e8871aadb
73.200071
apache-2.0
0
12
true
false
false
true
2024-01-21T09:23:35Z
false
70.307167
86.616212
64.57443
64.64988
82.241515
70.811221
false
Vezora_Mistral-22B-v0.1_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Vezora/Mistral-22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vezora/Mistral-22B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Vezora__Mistral-22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vezora/Mistral-22B-v0.1
de4570db2a99c39dde9572896f7a64c5d1be7717
49.936051
apache-2.0
150
22
true
true
false
true
2024-04-11T10:05:14Z
false
49.40273
72.92372
48.750575
47.348402
74.822415
6.368461
false
Vezora_Mistral-22B-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Vezora/Mistral-22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vezora/Mistral-22B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Vezora__Mistral-22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vezora/Mistral-22B-v0.1
de4570db2a99c39dde9572896f7a64c5d1be7717
49.79602
apache-2.0
150
22
true
true
false
true
2024-04-11T20:32:11Z
false
49.146758
72.81418
48.550566
47.313634
74.506709
6.444276
false
Vezora_Mistral-22B-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Vezora/Mistral-22B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Vezora/Mistral-22B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Vezora__Mistral-22B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Vezora/Mistral-22B-v0.2
905cb0b7ffa14413ba4553a1c45f1c8b555bd395
56.27343
apache-2.0
108
22
true
true
false
true
2024-04-13T12:54:02Z
false
52.389078
78.629755
54.631795
48.835639
76.164167
26.990144
false
VitalContribution_Evangelion-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/VitalContribution/Evangelion-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">VitalContribution/Evangelion-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_VitalContribution__Evangelion-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
VitalContribution/Evangelion-7B
7e3fdb60969ef0f7219cbcb9b05f7d1537af1c8d
71.710289
apache-2.0
4
7
true
true
true
true
2024-01-12T17:57:10Z
false
68.94198
86.446923
63.968832
64.0067
79.952644
66.944655
false
Voicelab_trurl-2-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Voicelab/trurl-2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Voicelab/trurl-2-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Voicelab/trurl-2-13b
c8b2bbc7a570a9ea67928674695a4e7dff017d66
58.732812
null
28
13
false
true
true
true
2023-09-09T10:52:17Z
false
60.068259
80.233021
78.589981
45.949382
74.743489
12.812737
false