| Column | Dtype | Stats |
| --- | --- | --- |
| eval_name | string | lengths 9–97 |
| Precision | string | 5 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 53 classes |
| Model | string | lengths 355–611 |
| fullname | string | lengths 4–89 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 27–81.3 |
| Hub License | string | 35 classes |
| Hub ❤️ | int64 | 0–4.88k |
| #Params (B) | int64 | 0–238 |
| Available on the hub | bool | 2 classes |
| Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| date | string | lengths 0–26 |
| Chat Template | bool | 2 classes |
| ARC | float64 | 19.7–87.5 |
| HellaSwag | float64 | 20.7–92.8 |
| MMLU | float64 | 17.8–89.4 |
| TruthfulQA | float64 | 27.9–82.3 |
| Winogrande | float64 | 47.2–91.5 |
| GSM8K | float64 | 0–88.2 |
| Maintainers Choice | bool | 2 classes |
grimjim_Mistral-7B-Instruct-demi-merge-v0.2-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/Mistral-7B-Instruct-demi-merge-v0.2-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__Mistral-7B-Instruct-demi-merge-v0.2-7B
grimjim/Mistral-7B-Instruct-demi-merge-v0.2-7B
db786274df9d55902a7c5e98a134e63deee1f558
65.713522
apache-2.0
0
7
true
false
true
true
2024-03-27T12:32:43Z
false
63.90785
84.893448
63.69461
55.262355
78.531965
47.990902
false
grimjim_Mistral-7B-Instruct-demi-merge-v0.3-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/Mistral-7B-Instruct-demi-merge-v0.3-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__Mistral-7B-Instruct-demi-merge-v0.3-7B
grimjim/Mistral-7B-Instruct-demi-merge-v0.3-7B
b361f0ee28ef597f13ad7dcd8adcf75d97293d9f
63.818018
apache-2.0
0
7
true
false
true
true
2024-05-29T17:52:17Z
false
62.286689
84.266082
63.732219
49.948148
79.005525
43.669447
false
grimjim_Mistral-Starling-merge-trial1-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/Mistral-Starling-merge-trial1-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial1-7B
grimjim/Mistral-Starling-merge-trial1-7B
8781341908ff63afe7a31e8692ae964cfb75cf38
67.489407
apache-2.0
0
7
true
false
true
true
2024-03-29T16:40:12Z
false
66.12628
84.674368
64.124826
53.178397
80.426204
56.406368
false
grimjim_Mistral-Starling-merge-trial3-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/Mistral-Starling-merge-trial3-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__Mistral-Starling-merge-trial3-7B
grimjim/Mistral-Starling-merge-trial3-7B
95cc5aafb8b12ae31b5fd5e68a0e9e3e16c7546a
66.903467
apache-2.0
0
7
true
false
true
true
2024-03-30T01:05:22Z
false
66.552901
84.813782
64.1799
52.847958
80.031571
52.994693
false
grimjim_cuckoo-starling-32k-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/cuckoo-starling-32k-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__cuckoo-starling-32k-7B
grimjim/cuckoo-starling-32k-7B
f1efb535ed146a6516f38cdacf211ef4b24f2a96
69.92861
cc-by-nc-4.0
2
7
true
false
true
true
2024-05-19T13:13:05Z
false
66.808874
85.96893
64.878429
59.030101
80.110497
62.774829
false
grimjim_cuckoo-starling-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/cuckoo-starling-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__cuckoo-starling-7B
grimjim/cuckoo-starling-7B
ce5731a8c09b8e3bd7cb6fa4e9dbab0ff5986861
69.92861
0
7
false
true
true
true
2024-05-16T01:56:07Z
false
66.808874
85.96893
64.878429
59.030101
80.110497
62.774829
false
grimjim_fireblossom-32K-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/fireblossom-32K-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__fireblossom-32K-7B
grimjim/fireblossom-32K-7B
09ab11c11ebbdaf820d937ddd2553251f62a1080
70.664918
cc-by-nc-4.0
2
7
true
false
true
true
2024-04-14T14:59:50Z
false
69.96587
86.476797
63.44642
63.122918
79.794791
61.182714
false
grimjim_kukulemon-32K-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kukulemon-32K-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kukulemon-32K-7B
grimjim/kukulemon-32K-7B
4ad0372fe633f916fb2b9268c6262022bc262567
65.796022
cc-by-nc-4.0
1
7
true
false
true
true
2024-05-30T12:24:54Z
false
65.955631
85.321649
62.105029
55.180437
78.374112
47.839272
false
grimjim_kukulemon-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kukulemon-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kukulemon-7B
grimjim/kukulemon-7B
80098e3132e20702cd33c049c47cfee6a26fa32c
70.200321
cc-by-nc-4.0
9
7
true
false
true
true
2024-03-12T05:06:07Z
false
67.74744
86.098387
65.090524
61.992185
79.242305
61.031084
false
grimjim_kukulemon-spiked-9B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kukulemon-spiked-9B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kukulemon-spiked-9B
grimjim/kukulemon-spiked-9B
3845c8ff5649f54dd40ff2d03e25074ce286b511
68.421227
cc-by-nc-4.0
2
8
true
false
true
true
2024-04-08T12:40:12Z
false
66.040956
85.879307
65.015289
64.238896
79.163378
50.189538
false
grimjim_kukulemon-v3-soul_mix-32k-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kukulemon-v3-soul_mix-32k-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kukulemon-v3-soul_mix-32k-7B
grimjim/kukulemon-v3-soul_mix-32k-7B
061160f1f3635986b5b96dcc26ca0f2f9ea24b57
65.885226
cc-by-nc-4.0
3
7
true
false
true
true
2024-06-06T22:46:55Z
false
66.467577
85.251942
62.08185
55.293491
78.453039
47.763457
false
grimjim_kuno-kunoichi-v1-DPO-v2-SLERP-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kuno-kunoichi-v1-DPO-v2-SLERP-7B
grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B
88d5b63f7d62baeab4704b72ed656aa8bee3a2fb
72.331731
cc-by-nc-4.0
3
7
true
false
true
true
2024-03-10T15:48:51Z
false
69.112628
87.333201
64.803853
65.123733
80.899763
66.71721
false
grimjim_kunoichi-lemon-royale-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kunoichi-lemon-royale-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kunoichi-lemon-royale-7B
grimjim/kunoichi-lemon-royale-7B
9b4c74995db06c1132379a6551b99aca7555f0fb
68.537517
cc-by-nc-4.0
2
7
true
false
true
true
2024-03-20T03:48:28Z
false
66.723549
85.490938
64.065932
60.193341
79.558011
55.193328
false
grimjim_kunoichi-lemon-royale-v2-32K-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kunoichi-lemon-royale-v2-32K-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kunoichi-lemon-royale-v2-32K-7B
grimjim/kunoichi-lemon-royale-v2-32K-7B
275028958a12bcdcf53a5003720514a8252dc931
67.51867
cc-by-nc-4.0
4
7
true
false
true
true
2024-05-19T13:20:27Z
false
66.552901
85.251942
62.932785
58.234992
78.689818
53.449583
false
grimjim_kunoichi-lemon-royale-v3-32K-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/kunoichi-lemon-royale-v3-32K-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kunoichi-lemon-royale-v3-32K-7B
grimjim/kunoichi-lemon-royale-v3-32K-7B
7dddf6b597afae7addb9bf64116dd8ed5aaafdd4
67.643096
cc-by-nc-4.0
3
7
true
false
true
true
2024-06-06T22:47:30Z
false
66.382253
85.271858
62.881995
58.345994
78.768745
54.207733
false
grimjim_llama-3-experiment-v1-9B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/grimjim/llama-3-experiment-v1-9B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__llama-3-experiment-v1-9B
grimjim/llama-3-experiment-v1-9B
db86ef18da77430b8830c6cebec5a8680146b311
66.40844
llama3
4
8
true
false
true
true
2024-04-28T19:56:56Z
false
60.665529
78.560048
66.709906
50.704525
75.927388
65.883245
false
grimjim_llama-3-merge-avalon-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/grimjim/llama-3-merge-avalon-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__llama-3-merge-avalon-8B
grimjim/llama-3-merge-avalon-8B
e10c6a3d61cdaa9bb1952a7556e2d09a59a7dca1
69.147755
cc-by-nc-4.0
0
8
true
false
true
true
2024-05-11T03:24:29Z
false
65.102389
82.473611
67.745589
55.221295
76.716654
67.62699
false
grimjim_llama-3-merge-pp-instruct-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/grimjim/llama-3-merge-pp-instruct-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__llama-3-merge-pp-instruct-8B
grimjim/llama-3-merge-pp-instruct-8B
0714ee023cd4c90541f77479f8895673f10de8e5
67.828699
cc-by-nc-4.0
0
8
true
false
true
true
2024-05-07T14:16:52Z
false
62.627986
80.412268
67.77139
52.236433
75.690608
68.23351
false
grimjim_llama-3-merge-virt-req-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/grimjim/llama-3-merge-virt-req-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__llama-3-merge-virt-req-8B
grimjim/llama-3-merge-virt-req-8B
7d5bce592db79052558fcac3c2fa6871d21a88d1
66.997818
llama3
2
8
true
false
true
true
2024-05-10T02:10:24Z
false
61.860068
80.521808
66.7343
52.352067
75.848461
64.670205
false
grimjim_madwind-wizard-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/madwind-wizard-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__madwind-wizard-7B
grimjim/madwind-wizard-7B
b41f5f846aa8e299b757acd4eb530655b822f917
63.002366
cc-by-nc-4.0
4
7
true
false
true
true
2024-05-06T23:42:35Z
false
63.395904
84.505079
63.31345
45.065686
78.216259
43.517817
false
grimjim_zephyr-beta-wizardLM-2-merge-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/zephyr-beta-wizardLM-2-merge-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__zephyr-beta-wizardLM-2-merge-7B
grimjim/zephyr-beta-wizardLM-2-merge-7B
a59ffc24c89345a5014a79dd82a8b3c6a61c4fcc
65.939603
apache-2.0
1
7
true
false
true
true
2024-04-17T11:40:23Z
false
64.675768
85.341565
63.415149
58.376245
77.505919
46.322972
false
grimjim_zephyr-wizard-kuno-royale-BF16-merge-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/grimjim/zephyr-wizard-kuno-royale-BF16-merge-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__zephyr-wizard-kuno-royale-BF16-merge-7B
grimjim/zephyr-wizard-kuno-royale-BF16-merge-7B
79d37e97d8f221319cd3928d51db5afa2400c7ce
71.567352
cc-by-nc-4.0
0
7
true
false
true
true
2024-04-19T17:22:23Z
false
68.771331
86.895041
64.85524
65.545393
80.031571
63.305534
false
guardrail_llama-2-7b-guanaco-instruct-sharded_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/guardrail/llama-2-7b-guanaco-instruct-sharded · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_guardrail__llama-2-7b-guanaco-instruct-sharded
guardrail/llama-2-7b-guanaco-instruct-sharded
fc7a3abbc3b9a9b3e163ef3c4844307ac270fca7
50.575069
apache-2.0
8
6
true
true
true
true
2023-10-16T12:46:18Z
false
53.754266
78.689504
46.654204
43.931022
72.61247
7.808946
false
gywy_llama2-13b-chinese-v2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/gywy/llama2-13b-chinese-v2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_gywy__llama2-13b-chinese-v2
gywy/llama2-13b-chinese-v2
8f6b11ca4344ac230d6b55defa4e04e60a39f9b5
49.584974
7
13
false
true
true
true
2023-10-16T12:48:18Z
false
53.924915
74.636527
49.736199
45.427145
71.586425
2.198635
false
h2m_mhm-7b-v1.3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/h2m/mhm-7b-v1.3 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2m__mhm-7b-v1.3
h2m/mhm-7b-v1.3
0e8363818fdbdc8bacb1467e019f49fa8a9f4329
47.291189
apache-2.0
2
7
true
false
false
true
2024-01-14T20:21:57Z
false
47.525597
65.305716
45.7429
46.220533
62.273086
16.679303
false
h2m_mhm-7b-v1.3-DPO-1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/h2m/mhm-7b-v1.3-DPO-1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2m__mhm-7b-v1.3-DPO-1
h2m/mhm-7b-v1.3-DPO-1
6ebd98fba486278e82be038bdc4b410c6bbd9c2d
47.766895
apache-2.0
1
7
true
true
true
true
2024-01-17T05:16:48Z
false
49.573379
68.103963
45.764333
45.884572
62.036306
15.238817
false
h2m_mhm-8x7B-FrankenMoE-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
https://huggingface.co/h2m/mhm-8x7B-FrankenMoE-v1.0 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2m__mhm-8x7B-FrankenMoE-v1.0
h2m/mhm-8x7B-FrankenMoE-v1.0
5aeee76977588d88d3faca8340c582c82cc598ce
74.0057
apache-2.0
2
46
true
false
false
true
2024-01-17T15:09:32Z
false
70.904437
87.751444
64.699673
67.10454
82.004736
71.569371
false
h2oai_h2o-danube-1.8b-base_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
https://huggingface.co/h2oai/h2o-danube-1.8b-base · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-base
h2oai/h2o-danube-1.8b-base
3201996d3a41b4a485582164db42ca58d51051aa
39.12018
apache-2.0
43
1
true
true
true
true
2024-01-31T08:27:34Z
false
39.419795
69.577773
25.935741
33.864253
64.483031
1.440485
false
h2oai_h2o-danube-1.8b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/h2oai/h2o-danube-1.8b-chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-chat
h2oai/h2o-danube-1.8b-chat
e2a18423798fa43e6c9935073d9c24c0cd901c6d
44.491044
apache-2.0
53
1
true
true
true
true
2024-01-31T08:28:40Z
false
41.12628
68.064131
33.405747
41.637248
65.351223
17.361638
false
h2oai_h2o-danube-1.8b-sft_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/h2oai/h2o-danube-1.8b-sft · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube-1.8b-sft
h2oai/h2o-danube-1.8b-sft
251a6e5b0749135c6109532734b803d15dd49b7a
43.679383
apache-2.0
11
1
true
true
true
true
2024-01-31T08:28:23Z
false
40.187713
67.337184
33.747864
40.286197
65.43015
15.087187
false
h2oai_h2o-danube2-1.8b-base_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
https://huggingface.co/h2oai/h2o-danube2-1.8b-base · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-base
h2oai/h2o-danube2-1.8b-base
a038fb5d19a0373b5158f3c014d37ca9f82882ba
48.723599
apache-2.0
40
1
true
true
true
true
2024-04-05T13:49:51Z
false
43.34471
72.953595
40.200667
38.012598
68.034728
29.795299
false
h2oai_h2o-danube2-1.8b-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/h2oai/h2o-danube2-1.8b-chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-chat
h2oai/h2o-danube2-1.8b-chat
2896625bf7e5034fe0356fc49d2db360f2175f6b
49.258258
apache-2.0
51
1
true
true
true
true
2024-04-05T13:49:27Z
false
43.686007
73.899622
37.96823
40.539322
68.90292
30.55345
false
h2oai_h2o-danube2-1.8b-chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/h2oai/h2o-danube2-1.8b-chat · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-chat
h2oai/h2o-danube2-1.8b-chat
4000d05999b447ae86529ce6e823bb2f469c2a41
48.89236
apache-2.0
51
1
true
true
true
true
2024-04-18T00:27:20Z
false
43.686007
73.949412
38.021072
40.537531
68.350434
28.809704
false
h2oai_h2o-danube2-1.8b-sft_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/h2oai/h2o-danube2-1.8b-sft · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2o-danube2-1.8b-sft
h2oai/h2o-danube2-1.8b-sft
9013ff4d82296aafe318b83ac42c3b9caede1d6c
47.35989
apache-2.0
5
1
true
true
true
true
2024-04-05T13:50:08Z
false
42.662116
72.754431
35.904899
38.704135
68.508287
25.625474
false
h2oai_h2ogpt-gm-oasst1-en-1024-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-12b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-12b
h2oai/h2ogpt-gm-oasst1-en-1024-12b
e547fffafb382fd39ef5de35ba3b5afc1b43e74d
40.651055
apache-2.0
5
12
true
true
true
true
2023-10-16T12:48:18Z
false
43.088737
69.747062
25.871203
37.997429
66.140489
1.06141
false
h2oai_h2ogpt-gm-oasst1-en-1024-20b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b
h2oai/h2ogpt-gm-oasst1-en-1024-20b
1a5b8d25587eab67d837621a6c9423e7ef6df289
42.581692
apache-2.0
4
20
true
true
true
true
2023-09-09T10:52:17Z
false
48.037543
72.76439
25.963823
39.924157
66.298343
2.501895
false
h2oai_h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt
h2oai/h2ogpt-gm-oasst1-en-1024-open-llama-7b-preview-400bt
29604e6e19822531b0d49d3f19abef603a97d0ec
39.89494
apache-2.0
3
7
true
true
true
true
2023-09-09T10:52:17Z
false
41.296928
62.437761
27.552151
42.004542
64.561957
1.5163
false
h2oai_h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt
h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt
754e0c90ed5d9241fdfd5a188572b3ea2152eaa7
34.323713
apache-2.0
12
7
true
true
true
true
2023-09-09T10:52:17Z
false
34.044369
50.507867
24.657073
41.800054
54.932912
0
false
h2oai_h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
h2oai/h2ogpt-gm-oasst1-en-2048-open-llama-7b-preview-300bt-v2
fdc6ff469295d0aaabec8948525b70d6688728ac
37.552342
apache-2.0
4
7
true
true
true
true
2023-09-09T10:52:17Z
false
36.433447
61.412069
25.01466
37.585548
64.640884
0.227445
false
h2oai_h2ogpt-gm-oasst1-multilang-1024-20b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
https://huggingface.co/h2oai/h2ogpt-gm-oasst1-multilang-1024-20b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-multilang-1024-20b
h2oai/h2ogpt-gm-oasst1-multilang-1024-20b
b3a6bf4250a037c09e451344e2a4e987011b79de
41.899971
apache-2.0
10
20
true
true
true
true
2023-09-09T10:52:17Z
false
47.440273
72.575184
26.365053
34.391319
68.429361
2.198635
false
h2oai_h2ogpt-oasst1-512-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b
h2oai/h2ogpt-oasst1-512-12b
c6bb0fe363e0105839d34ca757793b61c9606f95
40.479759
apache-2.0
27
12
true
true
true
true
2023-09-09T10:52:17Z
false
42.320819
70.244971
26.013947
36.411468
66.219416
1.66793
false
h2oai_h2ogpt-oasst1-512-20b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2ogpt-oasst1-512-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2ogpt-oasst1-512-20b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2ogpt-oasst1-512-20b
3bdf6f870ca14bcc5587b666fbe57488f7854d30
42.444764
apache-2.0
38
20
true
true
true
true
2023-09-09T10:52:17Z
false
46.928328
72.774348
26.249891
37.497059
68.034728
3.18423
false
h2oai_h2ogpt-oig-oasst1-256-6_9b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2ogpt-oig-oasst1-256-6_9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-256-6_9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2ogpt-oig-oasst1-256-6_9b
f1c9bac89b74d3487cb092788ce828fb9520c1a7
38.619083
apache-2.0
5
9
true
true
true
true
2023-10-16T12:48:18Z
false
39.931741
65.415256
26.392535
35.004791
63.378058
1.592115
false
h2oai_h2ogpt-oig-oasst1-512-6_9b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/h2oai/h2ogpt-oig-oasst1-512-6_9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h2oai/h2ogpt-oig-oasst1-512-6_9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oig-oasst1-512-6_9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h2oai/h2ogpt-oig-oasst1-512-6_9b
029a787e0d98fcd3fecffbfbeb4a75a425474937
38.517421
apache-2.0
17
9
true
true
true
true
2023-10-16T12:48:18Z
false
40.443686
65.584545
24.902097
36.67874
62.509866
0.985595
false
h4rz3rk4s3_TinyNewsLlama-1.1B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/h4rz3rk4s3/TinyNewsLlama-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h4rz3rk4s3/TinyNewsLlama-1.1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyNewsLlama-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h4rz3rk4s3/TinyNewsLlama-1.1B
a4e7c60302a70746c6bfc4a79d85f040c27c675d
36.412225
apache-2.0
0
1
true
true
true
true
2024-03-17T16:06:50Z
false
32.935154
59.430392
25.181759
40.951165
59.747435
0.227445
false
h4rz3rk4s3_TinyParlaMintLlama-1.1B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/h4rz3rk4s3/TinyParlaMintLlama-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h4rz3rk4s3/TinyParlaMintLlama-1.1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyParlaMintLlama-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h4rz3rk4s3/TinyParlaMintLlama-1.1B
0c9aa196c68732bf1b563dcfb4d9c6f835087e9e
34.969809
apache-2.0
0
1
true
true
true
true
2024-03-08T15:02:00Z
false
31.65529
55.865365
24.842186
38.813548
58.642463
0
false
h4rz3rk4s3_TinyPoliticaLlama-1.1B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/h4rz3rk4s3/TinyPoliticaLlama-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">h4rz3rk4s3/TinyPoliticaLlama-1.1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
h4rz3rk4s3/TinyPoliticaLlama-1.1B
8838d8f094dee1078572cf127f835cdb32117d6f
35.470572
apache-2.0
0
1
true
true
true
true
2024-03-08T15:01:21Z
false
33.788396
57.827126
25.452712
38.059852
57.695343
0
false
habanoz_TinyLlama-1.1B-2T-lr-2e-4-3ep-dolly-15k-instruct-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/habanoz/TinyLlama-1.1B-2T-lr-2e-4-3ep-dolly-15k-instruct-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">habanoz/TinyLlama-1.1B-2T-lr-2e-4-3ep-dolly-15k-instruct-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_habanoz__TinyLlama-1.1B-2T-lr-2e-4-3ep-dolly-15k-instruct-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
habanoz/TinyLlama-1.1B-2T-lr-2e-4-3ep-dolly-15k-instruct-v1
152436a0dd6ca1603b3993bbf08a227ea131f85d
34.043685
apache-2.0
1
1
true
true
true
true
2023-11-28T12:02:06Z
false
30.546075
53.704441
26.074168
35.847449
58.089976
0
false
habanoz_TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-1epch-airoboros3.1-1k-instruct-V1_float16
float16
?
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-1epch-airoboros3.1-1k-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-1epch-airoboros3.1-1k-instruct-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_habanoz__TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-1epch-airoboros3.1-1k-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-1epch-airoboros3.1-1k-instruct-V1
2b961bacab9fcd4bf9a0d6979b024fe23f61555e
34.977882
apache-2.0
0
1
true
true
true
true
2023-11-22T16:44:39Z
false
30.716724
54.321848
24.78374
41.670411
57.616417
0.75815
false
habanoz_TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-2.2epochs-oasst1-top1-instruct-V1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-2.2epochs-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-2.2epochs-oasst1-top1-instruct-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_habanoz__TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-2.2epochs-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-2.2epochs-oasst1-top1-instruct-V1
74cd9eba94e77832b3081689fc5c99c37c063790
35.445882
apache-2.0
0
1
true
true
true
true
2023-11-20T11:42:14Z
false
31.484642
54.401514
25.466179
42.344981
57.53749
1.440485
false
habanoz_TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-3epochs-oasst1-top1-instruct-V1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-3epochs-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-3epochs-oasst1-top1-instruct-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_habanoz__TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-3epochs-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-3epochs-oasst1-top1-instruct-V1
b1ec2a1e08eb790b9a32a43053316650921af943
35.42161
apache-2.0
0
1
true
true
true
true
2023-11-21T10:59:47Z
false
31.399317
54.242183
25.362436
42.46571
57.695343
1.36467
false
habanoz_TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-4epochs-oasst1-top1-instruct-V1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-4epochs-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-4epochs-oasst1-top1-instruct-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_habanoz__TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-4epochs-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
habanoz/TinyLlama-1.1B-intermediate-step-715k-1.5T-lr-5-4epochs-oasst1-top1-instruct-V1
7cd6d5ad10180127771e4326772eae3d40fa8445
35.275415
apache-2.0
1
1
true
true
true
true
2023-11-21T19:31:39Z
false
31.143345
54.31189
25.418997
41.715133
57.77427
1.288855
false
habanoz_TinyLlama-1.1B-step-2T-lr-5-5ep-oasst1-top1-instruct-V1_float16
float16
?
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/habanoz/TinyLlama-1.1B-step-2T-lr-5-5ep-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">habanoz/TinyLlama-1.1B-step-2T-lr-5-5ep-oasst1-top1-instruct-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_habanoz__TinyLlama-1.1B-step-2T-lr-5-5ep-oasst1-top1-instruct-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
habanoz/TinyLlama-1.1B-step-2T-lr-5-5ep-oasst1-top1-instruct-V1
586c223b539e05fd8a63733c6a540f292460e639
34.528551
apache-2.0
3
1
true
true
true
true
2023-11-24T17:36:35Z
false
31.05802
55.018921
26.408594
35.082606
58.01105
1.592115
false
habanoz_tinyllama-oasst1-top1-instruct-full-lr1-5-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/habanoz/tinyllama-oasst1-top1-instruct-full-lr1-5-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">habanoz/tinyllama-oasst1-top1-instruct-full-lr1-5-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_habanoz__tinyllama-oasst1-top1-instruct-full-lr1-5-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
habanoz/tinyllama-oasst1-top1-instruct-full-lr1-5-v0.1
e55b262cbd0ee52f7a4cbda136dbf1a027987c47
35.577748
apache-2.0
2
1
true
true
true
true
2023-11-22T07:07:31Z
false
32.849829
58.155746
25.963126
38.34755
57.695343
0.45489
false
hakurei_instruct-12b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/hakurei/instruct-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hakurei/instruct-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hakurei__instruct-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hakurei/instruct-12b
ff4699b502b79c716330b6f761002588a65dcba6
38.629704
apache-2.0
17
12
true
true
true
true
2023-09-09T10:52:17Z
false
42.576792
66.75961
26.792526
31.964867
63.456985
0.227445
false
hakurei_mommygpt-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hakurei/mommygpt-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hakurei/mommygpt-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hakurei__mommygpt-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hakurei/mommygpt-3B
0369335d693b753774050ae44dbaf73bac39e9eb
41.36177
apache-2.0
9
3
true
true
true
true
2023-11-26T05:29:45Z
false
41.894198
71.688907
28.735468
37.904442
65.824783
2.12282
false
hamxea_Llama-2-13b-chat-hf-activity-fine-tuned-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/Llama-2-13b-chat-hf-activity-fine-tuned-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/Llama-2-13b-chat-hf-activity-fine-tuned-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Llama-2-13b-chat-hf-activity-fine-tuned-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/Llama-2-13b-chat-hf-activity-fine-tuned-v4
3536f3ba1dd84a732958ea563f2a70ecdbb03bcd
57.636119
other
0
13
true
true
true
true
2024-03-31T14:25:05Z
false
59.215017
81.666999
54.512552
43.823907
75.059195
31.539045
false
hamxea_Llama-2-7b-chat-hf-activity-fine-tuned-v3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v3
471e796a06138051def6777c3742d9e196b56e08
52.991702
other
0
7
true
true
true
true
2024-03-31T14:36:53Z
false
53.327645
78.101972
48.312411
45.697951
73.480663
19.029568
false
hamxea_Llama-2-7b-chat-hf-activity-fine-tuned-v4_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4
c964836b57483ae83e5b7bc1ece1e121a7727a75
53.228962
other
0
7
true
true
true
true
2024-03-31T14:11:03Z
false
54.351536
78.121888
48.417075
45.827635
73.32281
19.332828
false
hamxea_Llama-2-7b-chat-hf-activity-fine-tuned-v4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Llama-2-7b-chat-hf-activity-fine-tuned-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/Llama-2-7b-chat-hf-activity-fine-tuned-v4
c964836b57483ae83e5b7bc1ece1e121a7727a75
53.298997
other
0
6
true
true
true
true
2024-03-31T14:26:02Z
false
54.266212
78.101972
48.439823
45.774743
73.954223
19.257013
false
hamxea_Mistral-7B-v0.1-activity-fine-tuned-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/Mistral-7B-v0.1-activity-fine-tuned-v2
72f64c7d384fde5d89736efa5a514cae84a2995f
60.981684
other
0
7
true
true
true
true
2024-03-31T14:53:04Z
false
60.068259
83.300139
64.088716
42.151373
78.374112
37.907506
false
hamxea_Mistral-7B-v0.1-activity-fine-tuned-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/Mistral-7B-v0.1-activity-fine-tuned-v3
ee975408108178dcd9b4f3bfbb5ed000357ce6b5
60.981684
other
0
7
true
true
true
true
2024-03-31T14:51:26Z
false
60.068259
83.300139
64.088716
42.151373
78.374112
37.907506
false
hamxea_Mistral-7B-v0.1-activity-fine-tuned-v5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/Mistral-7B-v0.1-activity-fine-tuned-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/Mistral-7B-v0.1-activity-fine-tuned-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__Mistral-7B-v0.1-activity-fine-tuned-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/Mistral-7B-v0.1-activity-fine-tuned-v5
ddeca14550068d75b10801ab1d261632b15f6264
60.981684
other
0
7
true
true
true
true
2024-03-31T14:42:55Z
false
60.068259
83.300139
64.088716
42.151373
78.374112
37.907506
false
hamxea_StableBeluga-7B-activity-fine-tuned-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hamxea/StableBeluga-7B-activity-fine-tuned-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hamxea/StableBeluga-7B-activity-fine-tuned-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hamxea__StableBeluga-7B-activity-fine-tuned-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hamxea/StableBeluga-7B-activity-fine-tuned-v2
97b647167ef3e6a043ff2c7a87ff1da117f32027
55.577725
other
0
7
true
true
true
true
2024-03-31T14:40:12Z
false
56.228669
79.057957
52.542398
50.013595
75.532755
20.090978
false
haonan-li_bactrian-x-llama-13b-merged_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/haonan-li/bactrian-x-llama-13b-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haonan-li/bactrian-x-llama-13b-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_haonan-li__bactrian-x-llama-13b-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haonan-li/bactrian-x-llama-13b-merged
cc5ee2231066c147423f89e9df40f7364c3275a5
51.998404
null
0
12
false
true
true
true
2023-10-16T12:46:18Z
false
56.399317
79.326827
48.396767
48.378791
73.954223
5.534496
false
haoranxu_ALMA-13B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/haoranxu/ALMA-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/ALMA-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/ALMA-13B
6798d9501a71b203be0610e640ec92fc08ea8dc6
50.155531
mit
30
13
true
true
true
true
2023-12-06T14:17:42Z
false
56.825939
80.29277
49.922975
37.569482
76.322021
0
false
haoranxu_ALMA-13B-Pretrain_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/haoranxu/ALMA-13B-Pretrain" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/ALMA-13B-Pretrain</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-Pretrain" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/ALMA-13B-Pretrain
69e9e12d8bab66dffdcb15fa534fc3f0dc34acec
51.681126
mit
8
13
true
true
true
true
2023-10-16T12:48:18Z
false
56.911263
80.153356
50.308104
37.442733
76.400947
8.870356
false
haoranxu_ALMA-13B-R_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
<a target="_blank" href="https://huggingface.co/haoranxu/ALMA-13B-R" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/ALMA-13B-R</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-13B-R" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/ALMA-13B-R
f0a3613c5da62cbe85fb90ea348932ddfc022b22
49.315284
mit
70
13
true
true
true
true
2024-01-20T05:09:17Z
false
55.546075
79.446325
49.517689
36.085639
75.295975
0
false
haoranxu_ALMA-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/haoranxu/ALMA-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">haoranxu/ALMA-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_haoranxu__ALMA-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
haoranxu/ALMA-7B
b570315dd26452a07cf15cf6feecce839e1327a6
45.317245
mit
21
7
true
true
true
true
2023-12-06T21:49:05Z
false
50.341297
75.502888
38.039746
35.643848
72.375691
0
false
harborwater_dpo-test-hermes-open-llama-3b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/harborwater/dpo-test-hermes-open-llama-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harborwater/dpo-test-hermes-open-llama-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harborwater/dpo-test-hermes-open-llama-3b
5cd560152a364f61f92cebe18feaefc181dfd287
39.415776
null
0
3
false
true
true
true
2024-01-14T03:58:03Z
false
39.249147
67.456682
24.214428
39.805627
64.404104
1.36467
false
harborwater_open-llama-3b-claude-30k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/harborwater/open-llama-3b-claude-30k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harborwater/open-llama-3b-claude-30k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-claude-30k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harborwater/open-llama-3b-claude-30k
049db7fda44e5ce1e8febf5c3f45e3a93aaaa859
40.931675
apache-2.0
4
3
true
true
true
true
2023-11-21T01:41:36Z
false
41.723549
72.644891
24.028403
38.45945
66.535122
2.198635
false
harborwater_open-llama-3b-everything-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/harborwater/open-llama-3b-everything-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harborwater/open-llama-3b-everything-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everything-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harborwater/open-llama-3b-everything-v2
31ce2c1611d9f7d56184ceb5bff6a7e95a180c03
41.408989
apache-2.0
1
3
true
true
true
true
2023-10-16T13:19:55Z
false
42.832765
73.282215
26.873957
37.258831
66.614049
1.592115
false
harborwater_open-llama-3b-everythingLM-2048_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/harborwater/open-llama-3b-everythingLM-2048" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harborwater/open-llama-3b-everythingLM-2048</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-everythingLM-2048" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harborwater/open-llama-3b-everythingLM-2048
1f9e8d48163feb63ed190eaa982f393542a75d30
40.617889
apache-2.0
2
3
true
true
true
true
2023-10-16T12:48:18Z
false
42.74744
71.718781
27.163964
34.262508
66.298343
1.5163
false
harborwater_open-llama-3b-v2-wizard-evol-instuct-v2-196k_8bit
8bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k
4da0c661e6df1235c9997b996c8e395b87248406
41.0931
apache-2.0
4
3
true
true
true
true
2023-10-16T12:48:18Z
false
41.211604
72.883888
25.387343
38.869603
66.614049
1.592115
false
harborwater_open-llama-3b-v2-wizard-evol-instuct-v2-196k_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__open-llama-3b-v2-wizard-evol-instuct-v2-196k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harborwater/open-llama-3b-v2-wizard-evol-instuct-v2-196k
4da0c661e6df1235c9997b996c8e395b87248406
41.460419
apache-2.0
4
3
true
true
true
true
2023-10-16T12:54:17Z
false
41.808874
73.013344
26.358882
38.993062
66.692976
1.895375
false
harborwater_wizard-orca-3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/harborwater/wizard-orca-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harborwater/wizard-orca-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__wizard-orca-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harborwater/wizard-orca-3b
ffc81b58375342f12e38a67272d95458a72e8d09
41.003345
apache-2.0
4
3
true
true
true
true
2023-10-16T12:48:18Z
false
41.723549
71.77853
24.491112
40.035714
66.929755
1.06141
false
harshitv804_MetaMath-Mistral-2x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/harshitv804/MetaMath-Mistral-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">harshitv804/MetaMath-Mistral-2x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_harshitv804__MetaMath-Mistral-2x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
harshitv804/MetaMath-Mistral-2x7B
193485a4016e12c1a3d3347801648fa4913dbd7c
65.843527
apache-2.0
2
7
true
false
false
true
2024-03-09T16:04:28Z
false
60.580205
82.593109
61.865053
44.797379
76.006314
69.219105
false
health360_Healix-1.1B-V1-Chat-dDPO_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/health360/Healix-1.1B-V1-Chat-dDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">health360/Healix-1.1B-V1-Chat-dDPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_health360__Healix-1.1B-V1-Chat-dDPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
health360/Healix-1.1B-V1-Chat-dDPO
07dd0532fda09df289f6617e1135b09fb705080d
33.004554
apache-2.0
2
1
true
true
true
true
2024-02-22T01:33:28Z
false
30.546075
44.781916
24.636819
41.551072
56.511444
0
false
health360_Healix-3B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/health360/Healix-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">health360/Healix-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_health360__Healix-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
health360/Healix-3B
52297e0b6845b3c1b26f336fd2a2c9b2f56ce6ba
38.929456
null
0
3
false
true
true
true
2023-10-16T12:48:18Z
false
37.713311
65.943039
26.018388
37.39799
65.745856
0.75815
false
health360_Healix-410M_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/health360/Healix-410M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">health360/Healix-410M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_health360__Healix-410M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
health360/Healix-410M
df5a3cec54a0bdd22e1644bfe576c7b58eca6bfd
30.099863
null
0
0
false
true
true
true
2023-10-16T12:58:30Z
false
25.085324
32.015535
24.939662
44.415008
54.143646
0
false
hedronstone_OpenHermes-7B-Reasoner_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/hedronstone/OpenHermes-7B-Reasoner" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hedronstone/OpenHermes-7B-Reasoner</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hedronstone__OpenHermes-7B-Reasoner" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hedronstone/OpenHermes-7B-Reasoner
d26f2defbf9f40a65dbb2ead08c79cd61096ed08
64.435087
0
7
false
true
true
true
2023-12-11T02:52:23Z
false
63.139932
82.732523
62.618294
48.821728
75.848461
53.449583
false
hedronstone_OpenHermes-7B-Symbolic_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/hedronstone/OpenHermes-7B-Symbolic" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hedronstone/OpenHermes-7B-Symbolic</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hedronstone__OpenHermes-7B-Symbolic" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hedronstone/OpenHermes-7B-Symbolic
23eb76553aa37cd48c1f2d8a314d78fd3ead53f6
64.435087
apache-2.0
0
7
true
true
true
true
2023-12-11T03:41:45Z
false
63.139932
82.732523
62.618294
48.821728
75.848461
53.449583
false
heegyu_LIMA-13b-hf_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/LIMA-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/LIMA-13b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/LIMA-13b-hf
98faa74a9b41cbd9033904cd58420705936849eb
52.606002
other
1
13
true
true
true
true
2023-09-09T10:52:17Z
false
57.423208
81.676957
48.715286
41.759991
77.190213
8.870356
false
heegyu_LIMA2-13b-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/LIMA2-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/LIMA2-13b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/LIMA2-13b-hf
ed3535921eb24e0737f9a6cda70b1a3fd71532cd
52.984156
null
0
13
false
true
true
true
2023-09-09T10:52:17Z
false
60.238908
83.688508
53.166131
41.805567
73.243883
5.761941
false
heegyu_LIMA2-7b-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/LIMA2-7b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/LIMA2-7b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__LIMA2-7b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/LIMA2-7b-hf
6a1aa59cb7624f059728840ce68b20b1070ebdcb
49.266792
null
2
7
false
true
true
true
2023-09-09T10:52:17Z
false
53.242321
80.601474
43.219879
44.741549
69.928966
3.866566
false
heegyu_RedTulu-Uncensored-3B-0719_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/RedTulu-Uncensored-3B-0719" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/RedTulu-Uncensored-3B-0719</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__RedTulu-Uncensored-3B-0719" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/RedTulu-Uncensored-3B-0719
c92bf022cddc3f57b4552ec3391df487295a2f87
39.192
apache-2.0
2
3
true
true
true
true
2023-09-09T10:52:17Z
false
40.017065
62.547301
30.368248
37.592921
62.352013
2.27445
false
heegyu_WizardVicuna-3B-0719_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/WizardVicuna-3B-0719" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/WizardVicuna-3B-0719</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-3B-0719" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/WizardVicuna-3B-0719
62d3d450b8ab2bd2fb9f82383b55d1ecae33a401
39.483782
apache-2.0
1
3
true
true
true
true
2023-09-09T10:52:17Z
false
40.699659
65.44513
25.442489
40.705648
63.851618
0.75815
false
heegyu_WizardVicuna-Uncensored-3B-0719_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/WizardVicuna-Uncensored-3B-0719" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/WizardVicuna-Uncensored-3B-0719</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-Uncensored-3B-0719" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/WizardVicuna-Uncensored-3B-0719
36841c80535bc3e8403e3cc084e8e65884c75076
39.726839
apache-2.0
5
3
true
true
true
true
2023-09-09T10:52:17Z
false
41.382253
66.191994
26.531367
39.345504
63.772691
1.137225
false
heegyu_WizardVicuna-open-llama-3b-v2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/WizardVicuna-open-llama-3b-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/WizardVicuna-open-llama-3b-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna-open-llama-3b-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/WizardVicuna-open-llama-3b-v2
1c69905286171d7d3ef3f95f8e1bbc9150bad3cd
38.770953
apache-2.0
0
3
true
true
true
true
2023-09-09T10:52:17Z
false
37.713311
66.600279
27.229782
36.797619
63.299132
0.985595
false
heegyu_WizardVicuna2-13b-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/heegyu/WizardVicuna2-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">heegyu/WizardVicuna2-13b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
heegyu/WizardVicuna2-13b-hf
6cfd95e2dcdb6996afa9eb5c63273a1a3524c6c6
51.050916
null
0
13
false
true
true
true
2023-10-16T12:46:18Z
false
55.375427
79.137622
48.455095
42.426819
73.480663
7.429871
false
hexinran09_xr_dat_test_part2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hexinran09/xr_dat_test_part2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hexinran09/xr_dat_test_part2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hexinran09__xr_dat_test_part2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hexinran09/xr_dat_test_part2
00c11f89113bdb12ebb884306cf81cdc165ba28d
51.254933
apache-2.0
0
13
true
true
true
true
2024-04-22T03:44:13Z
false
57.679181
82.065326
53.023688
39.097662
73.08603
2.57771
false
hfl_chinese-alpaca-2-1.3b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-alpaca-2-1.3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-alpaca-2-1.3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-1.3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-alpaca-2-1.3b
328cd57b6396c32772e34d1eddb43340641044e0
29.34116
apache-2.0
9
1
true
true
true
true
2024-05-29T03:08:10Z
false
24.488055
30.173272
25.880136
44.597842
50.907656
0
false
hfl_chinese-alpaca-2-1.3b-rlhf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-alpaca-2-1.3b-rlhf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-alpaca-2-1.3b-rlhf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-1.3b-rlhf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-alpaca-2-1.3b-rlhf
69c96ad69b6723d29723dbcd2bbf4ff338a7b0bd
29.388139
apache-2.0
2
1
true
true
true
true
2024-05-29T03:10:27Z
false
23.890785
30.013941
26.533731
45.061647
50.828729
0
false
hfl_chinese-alpaca-2-13b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-alpaca-2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-alpaca-2-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-alpaca-2-13b
3b2e3895ff83c8892ab20fb8f98754d947879186
57.405411
apache-2.0
84
13
true
true
true
true
2023-12-06T19:12:52Z
false
58.703072
79.755029
55.121145
50.222586
75.611681
25.018954
false
hfl_chinese-alpaca-2-13b-16k_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-alpaca-2-13b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-alpaca-2-13b-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-alpaca-2-13b-16k
ba4536aed022c49bda60e1b56a0dbefc2ea6a30a
54.117649
apache-2.0
29
13
true
true
true
true
2023-12-06T13:32:24Z
false
55.03413
77.414858
51.281905
46.496695
73.401736
21.076573
false
hfl_chinese-alpaca-2-7b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-alpaca-2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-alpaca-2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-alpaca-2-7b
b9eeeddf488d3c1f67a374929a62a06fc2d51adf
50.214097
apache-2.0
159
7
true
true
true
true
2024-05-29T03:08:25Z
false
49.573379
72.644891
46.548685
48.629365
70.165746
13.722517
false
hfl_chinese-alpaca-2-7b-16k_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-alpaca-2-7b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-alpaca-2-7b-16k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-7b-16k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-alpaca-2-7b-16k
08fca93c1910003ede0beaf9f5dba887ab67d84c
48.021648
apache-2.0
18
7
true
true
true
true
2024-05-29T03:10:01Z
false
48.464164
70.30472
42.935223
48.592247
68.508287
9.325246
false
hfl_chinese-alpaca-2-7b-rlhf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-alpaca-2-7b-rlhf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-alpaca-2-7b-rlhf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-7b-rlhf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-alpaca-2-7b-rlhf
e802d0bcf0235c8db42ad04880e2433ef0f4cfda
50.923428
apache-2.0
2
7
true
true
true
true
2024-05-29T03:10:35Z
false
49.488055
72.605059
46.290504
51.190567
70.955012
15.011372
false
hfl_chinese-llama-2-1.3b_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-llama-2-1.3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-llama-2-1.3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-llama-2-1.3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-llama-2-1.3b
99f0fa34dfcdcd1497efdcefce45712cd9fed9ea
28.588908
apache-2.0
14
1
true
true
true
true
2024-05-29T03:03:55Z
false
22.354949
28.301135
24.918203
46.077549
49.88161
0
false
hfl_chinese-llama-2-13b_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/hfl/chinese-llama-2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">hfl/chinese-llama-2-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-llama-2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
hfl/chinese-llama-2-13b
043f8d278c0338b7cc89656ec3581ec8955838b5
51.995837
apache-2.0
33
13
true
true
true
true
2024-05-29T03:07:09Z
false
55.802048
79.506074
52.938843
38.257314
75.690608
9.780136
false