| Column | dtype | Values |
|---|---|---|
| eval_name | string | lengths 9-97 |
| Precision | string | 5 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 53 classes |
| Model | string | lengths 355-611 |
| fullname | string | lengths 4-89 |
| Model sha | string | lengths 0-40 |
| Average ⬆️ | float64 | 27 to 81.3 |
| Hub License | string | 35 classes |
| Hub ❤️ | int64 | 0 to 4.88k |
| #Params (B) | int64 | 0 to 238 |
| Available on the hub | bool | 2 classes |
| Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| date | string | lengths 0-26 |
| Chat Template | bool | 2 classes |
| ARC | float64 | 19.7 to 87.5 |
| HellaSwag | float64 | 20.7 to 92.8 |
| MMLU | float64 | 17.8 to 89.4 |
| TruthfulQA | float64 | 27.9 to 82.3 |
| Winogrande | float64 | 47.2 to 91.5 |
| GSM8K | float64 | 0 to 88.2 |
| Maintainers Choice | bool | 2 classes |
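The listing above is the column schema for the table that follows, one row per scored model. In the underlying page the "T" value is just the leading emoji of "Type", and the "Model" cell is the hub link for "fullname" plus a 📑 link to that model's per-model results dataset, so those two columns are folded into the others below. A minimal Python sketch of the record layout (standard library only; the field names are my own shorthand for the columns above): `details_dataset` rebuilds the results-dataset id each row links to, and `recompute_average` reflects the observation that, for the rows below, "Average ⬆️" equals the plain mean of the six benchmark columns, checked here against the first row.

```python
from dataclasses import dataclass


@dataclass
class LeaderboardRow:
    """One row of the table below; types follow the column dtypes listed above."""
    eval_name: str
    precision: str
    model_type: str          # "Type" column, e.g. "🔶 fine-tuned on domain-specific datasets"
    weight_type: str         # "Weight type": Original / Adapter / ...
    architecture: str
    fullname: str            # hub repo id, e.g. "pankajmathur/orca_mini_v4_8b"
    model_sha: str
    average: float           # "Average ⬆️"
    hub_license: str
    hub_likes: int           # "Hub ❤️"
    params_b: int            # "#Params (B)"
    available_on_hub: bool
    merged: bool
    moe: bool
    flagged: bool
    date: str                # ISO 8601 submission timestamp
    chat_template: bool
    arc: float
    hellaswag: float
    mmlu: float
    truthfulqa: float
    winogrande: float
    gsm8k: float
    maintainers_choice: bool


def details_dataset(fullname: str) -> str:
    """Repo id of the per-model results dataset each row links to."""
    return "open-llm-leaderboard/details_" + fullname.replace("/", "__")


def recompute_average(row: LeaderboardRow) -> float:
    """'Average ⬆️' is the plain mean of the six benchmark columns."""
    scores = (row.arc, row.hellaswag, row.mmlu, row.truthfulqa, row.winogrande, row.gsm8k)
    return sum(scores) / len(scores)


# First row of the table below: the mean of its six scores reproduces 66.64627.
orca_mini_v4 = LeaderboardRow(
    "pankajmathur_orca_mini_v4_8b_bfloat16", "bfloat16",
    "🔶 fine-tuned on domain-specific datasets", "Original", "LlamaForCausalLM",
    "pankajmathur/orca_mini_v4_8b", "665bccf382c7176377243b1462e7f212bc8c13e6",
    66.64627, "llama3", 2, 8, True, True, True, True,
    "2024-05-29T03:43:46Z", False,
    58.020478, 81.647082, 63.231271, 55.77665, 73.954223, 67.247915, False,
)
assert details_dataset(orca_mini_v4.fullname) == "open-llm-leaderboard/details_pankajmathur__orca_mini_v4_8b"
assert abs(recompute_average(orca_mini_v4) - orca_mini_v4.average) < 1e-5
```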
| eval_name | Precision | Type | Weight type | Architecture | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Merged | MoE | Flagged | date | Chat Template | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Maintainers Choice |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| pankajmathur_orca_mini_v4_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v4_8b | 665bccf382c7176377243b1462e7f212bc8c13e6 | 66.64627 | llama3 | 2 | 8 | true | true | true | true | 2024-05-29T03:43:46Z | false | 58.020478 | 81.647082 | 63.231271 | 55.77665 | 73.954223 | 67.247915 | false |
| pankajmathur_orca_mini_v5_8b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pankajmathur/orca_mini_v5_8b | efb9b7beea4e79bd7b02ff8faf9022602c566c31 | 67.276781 | llama3 | 2 | 8 | true | true | true | true | 2024-05-29T03:57:15Z | false | 60.921502 | 81.776539 | 64.969614 | 55.040121 | 73.401736 | 67.551175 | false |
| pankajmathur_orca_mini_v5_8b_dpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v5_8b_dpo | 934ae52f399e0a2ea84bb87746fa42651bb1edc2 | 67.777401 | llama3 | 2 | 8 | true | true | true | true | 2024-05-30T20:04:59Z | false | 61.860068 | 82.354113 | 65.10224 | 56.243441 | 73.401736 | 67.702805 | false |
| pankajmathur_orca_mini_v5_8b_orpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pankajmathur/orca_mini_v5_8b_orpo | 4cdc018043ef439f15bd8a09c4f09c6bc528dfc7 | 65.983592 | llama3 | 0 | 8 | true | true | true | true | 2024-05-31T16:15:23Z | false | 57.081911 | 79.934276 | 64.668355 | 53.435534 | 74.822415 | 65.95906 | false |
| pansophic_gemma-2b-sft-preview_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | GemmaForCausalLM | pansophic/gemma-2b-sft-preview | be9dd0afceebbe5573dd73ca9ff488bb3f1a3df1 | 54.771756 | other | 0 | 2 | true | true | true | true | 2024-04-09T17:06:38Z | false | 52.303754 | 73.600876 | 45.831757 | 51.295626 | 67.008682 | 38.589841 | false |
| pansophic_m16_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | PhiForCausalLM | pansophic/m16 | 61b76c29f02a6b27f17b3e73ce50c218dfc6b7ff | 60.24587 | | 0 | 0 | false | true | true | true | 2024-03-15T15:29:28Z | false | 59.812287 | 74.815774 | 56.307555 | 47.107319 | 75.138122 | 48.294162 | false |
| pansophic_m17_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | PhiForCausalLM | pansophic/m17 | af805fe99130a741b4d688f9e048b6f69362522f | 60.421329 | | 0 | 0 | false | true | true | true | 2024-03-15T17:29:14Z | false | 59.641638 | 74.407489 | 56.124605 | 46.616392 | 75.927388 | 49.810462 | false |
| pansophic_m2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | PhiForCausalLM | pansophic/m2 | 1be3a323f2d735eb6aad1905c5bfb2bec4475d6f | 60.385082 | | 0 | 0 | false | true | true | true | 2024-03-16T00:16:07Z | false | 61.262799 | 75.283808 | 54.728867 | 48.17078 | 74.191002 | 48.673237 | false |
| pansophic_m3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | PhiForCausalLM | pansophic/m3 | c55ddfa2a2e72141f5cf6ddefb5596d79efcfd72 | 60.234115 | | 0 | 0 | false | true | true | true | 2024-03-17T21:15:12Z | false | 60.409556 | 74.487154 | 56.51351 | 44.983652 | 76.716654 | 48.294162 | false |
| pansophic_new_model_test_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | GemmaForCausalLM | pansophic/new_model_test | b80248dbdf3e3d4ee4a8d498afd8a4d96892ff85 | 54.62725 | | 0 | 2 | false | true | true | true | 2024-02-27T13:00:59Z | false | 52.559727 | 73.650667 | 46.023145 | 51.245187 | 66.377269 | 37.907506 | false |
| pansophic_new_model_test2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | PhiForCausalLM | pansophic/new_model_test2 | f8b3ddd61dcf89f6ee6c5cac4185ff6c00f767a0 | 61.703655 | | 0 | 0 | false | true | true | true | 2024-03-01T08:06:49Z | false | 62.030717 | 75.363473 | 56.030346 | 46.542897 | 77.03236 | 53.222138 | false |
| pansophic_new_model_test3_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | StableLmForCausalLM | pansophic/new_model_test3 | 5fc0394d59ea72784285eeb2252411b88e9b6d9d | 56.524083 | | 0 | 0 | false | true | true | true | 2024-03-01T08:08:31Z | false | 51.791809 | 78.609839 | 49.141937 | 46.890503 | 70.481452 | 42.228961 | false |
| pansophic_nmt_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pansophic/nmt | 337733e4abfe2f44914dbb6025d82845d446c05c | 64.057381 | | 0 | 6 | false | true | true | true | 2024-05-28T21:17:31Z | false | 62.457338 | 78.799044 | 63.319523 | 55.623486 | 72.059984 | 52.084913 | false |
| pansophic_rocket-3B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | StableLmForCausalLM | pansophic/rocket-3B | ddf1caac5a50ff0984f08c9e195eaf952e3b0ca8 | 55.771603 | cc-by-sa-4.0 | 76 | 2 | true | true | true | true | 2024-02-27T12:25:48Z | false | 50.59727 | 76.687911 | 47.103584 | 55.818031 | 67.955801 | 36.46702 | false |
| paulilioaica_Collin-7B-dare_float16 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulilioaica/Collin-7B-dare | c8cc55a64ad062fe5ea9b6268c4affadc0975219 | 60.651161 | | 0 | 7 | false | true | true | true | 2024-01-28T17:19:18Z | false | 65.870307 | 82.075284 | 51.857718 | 65.202347 | 77.900552 | 21.000758 | false |
| paulilioaica_Hugo-7B-slerp_float16 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulilioaica/Hugo-7B-slerp | 820dcd204a79f46110fad378907f0be35a266ecb | 67.071279 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-01-28T14:50:54Z | false | 64.505119 | 84.773949 | 62.539829 | 57.127623 | 80.031571 | 53.449583 | false |
| paulilioaica_MoEstral-2x2B_float16 | float16 | 🤝 base merges and moerges | Original | MixtralForCausalLM | paulilioaica/MoEstral-2x2B | fa00d779934bc7907f6031c318852b1faa513bf6 | 66.343206 | | 0 | 12 | false | true | false | true | 2024-01-29T11:34:32Z | false | 65.102389 | 84.82374 | 61.6218 | 62.724005 | 78.374112 | 45.413192 | false |
| paulml_DPOB-INMTOB-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/DPOB-INMTOB-7B | a8871af9db183f2e7fe7c30bb2242b3b7827e53f | 76.208284 | cc-by-nc-4.0 | 2 | 7 | true | false | true | true | 2024-02-12T14:36:47Z | false | 73.208191 | 88.996216 | 64.537195 | 76.600759 | 84.68824 | 69.219105 | false |
| paulml_DPOB-NMTOB-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/DPOB-NMTOB-7B | 547fe9adccf3ab12b91bb77f6ee5daa033757a15 | 75.998996 | cc-by-nc-4.0 | 0 | 7 | true | false | true | true | 2024-02-12T12:00:40Z | false | 73.122867 | 88.946425 | 64.695744 | 75.075479 | 85.1618 | 68.99166 | false |
| paulml_NMTOB-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/NMTOB-7B | 27380e38769851edfc8d720ec88a066b40d8a85e | 75.97 | cc-by-nc-4.0 | 0 | 7 | true | false | true | true | 2024-02-12T11:38:37Z | false | 73.037543 | 88.936467 | 64.628983 | 75.063545 | 85.1618 | 68.99166 | false |
| paulml_NeuralOmniBeagleMBX-v3-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/NeuralOmniBeagleMBX-v3-7B | 37084955ee092548abfe356be4e6cfc46daa9cb4 | 75.926516 | cc-by-nc-4.0 | 1 | 7 | true | false | true | true | 2024-02-05T13:33:37Z | false | 73.37884 | 88.906592 | 64.992577 | 73.103559 | 84.21468 | 70.962851 | false |
| paulml_NeuralOmniWestBeaglake-7B_float16 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/NeuralOmniWestBeaglake-7B | b02cba26616d558094f7dca72419367c56937a47 | 74.434227 | cc-by-nc-4.0 | 2 | 7 | true | false | true | true | 2024-02-05T08:59:04Z | false | 73.720137 | 89.693288 | 63.95777 | 75.097625 | 84.92502 | 59.211524 | false |
| paulml_OGNO-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/OGNO-7B | a5d97f2e6962dc2c539a5bbca6a1160f87ccce84 | 76.34085 | cc-by-nc-4.0 | 17 | 7 | true | false | true | true | 2024-02-12T17:31:57Z | false | 73.122867 | 88.996216 | 64.586211 | 76.522684 | 84.68824 | 70.128886 | false |
| paulml_OmniBeagleMBX-v3-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/OmniBeagleMBX-v3-7B | 01bc122ec9d4a523fc012e792e2ba23f0f9bea68 | 75.956809 | cc-by-nc-4.0 | 1 | 7 | true | false | true | true | 2024-02-04T16:58:10Z | false | 73.805461 | 89.065923 | 64.660484 | 73.515488 | 85.398579 | 69.29492 | false |
| paulml_OmniBeagleSquaredMBX-v3-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/OmniBeagleSquaredMBX-v3-7B | 88928f55d51c0819de3b64e6c37689b87a89aac4 | 75.910541 | cc-by-nc-4.0 | 1 | 7 | true | false | true | true | 2024-02-09T18:14:50Z | false | 74.40273 | 88.816969 | 65.088255 | 72.695458 | 85.240726 | 69.219105 | false |
| paulml_OmniBeagleSquaredMBX-v3-7B-v2_bfloat16 | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | paulml/OmniBeagleSquaredMBX-v3-7B-v2 | 7eb4f63abc5c6891503008eb613287eff8c15e30 | 75.983359 | cc-by-nc-4.0 | 1 | 7 | true | false | true | true | 2024-02-09T22:34:41Z | false | 74.061433 | 88.926509 | 64.528837 | 72.925501 | 85.556433 | 69.90144 | false |
| pe-nlp_llama-2-13b-platypus-vicuna-wizard_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pe-nlp/llama-2-13b-platypus-vicuna-wizard | 71aa919fc15fa9d9def9185791b15a3f76e7bd8d | 52.895458 | | 2 | 13 | false | true | true | true | 2023-09-09T10:52:17Z | false | 61.262799 | 82.31428 | 55.21067 | 41.905685 | 75.769534 | 0.90978 | false |
| pe-nlp_llama-2-13b-vicuna-wizard_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pe-nlp/llama-2-13b-vicuna-wizard | b51bf8c4e132308751cc8b9d9c1131539f79f07f | 51.935837 | | 0 | 13 | false | true | true | true | 2023-09-09T10:52:17Z | false | 57.764505 | 82.164907 | 54.681539 | 41.114022 | 74.980268 | 0.90978 | false |
| perlthoughts_Chupacabra-16B-v2.01_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | perlthoughts/Chupacabra-16B-v2.01 | 3b723559b550a34e489cc41ec5414e00531ec2ae | 63.41598 | | 0 | 14 | false | true | true | true | 2023-12-07T06:38:48Z | false | 65.358362 | 82.921729 | 63.274661 | 64.534461 | 79.084451 | 25.322214 | false |
| perlthoughts_Chupacabra-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Chupacabra-7B | ae20703e16d89ba4a4301d12195cede64bd2ebdd | 67.761163 | apache-2.0 | 5 | 7 | true | true | true | true | 2023-11-21T01:19:24Z | false | 66.808874 | 83.519219 | 62.680607 | 52.305519 | 79.084451 | 62.168309 | false |
| perlthoughts_Chupacabra-7B-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Chupacabra-7B-v2 | 0c7f7c85359f15d3e6c361e8192738bdfb14ea6c | 67.037552 | apache-2.0 | 33 | 7 | true | false | true | true | 2023-12-03T03:34:18Z | false | 65.187713 | 83.389763 | 63.601288 | 57.170775 | 78.137332 | 54.738438 | false |
| perlthoughts_Chupacabra-7B-v2.01_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Chupacabra-7B-v2.01 | 438642201e2a91e9456d2a8ca1d7443e5ec55a40 | 70.425134 | apache-2.0 | 4 | 7 | true | true | true | true | 2023-12-07T06:38:23Z | false | 68.856655 | 86.118303 | 63.902623 | 63.501677 | 80.50513 | 59.666414 | false |
| perlthoughts_Chupacabra-7B-v2.02_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Chupacabra-7B-v2.02 | 24fb5e81b1d39d4358930a1f9054513e9e2d6373 | 69.822041 | apache-2.0 | 4 | 7 | true | true | true | true | 2023-12-10T20:44:25Z | false | 67.662116 | 83.89763 | 61.975676 | 64.0558 | 79.400158 | 61.940864 | false |
| perlthoughts_Chupacabra-7B-v2.03_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Chupacabra-7B-v2.03 | 73641ebe6ba450a83f6e80ed919fba48cc5f2837 | 65.343216 | | 0 | 7 | false | true | true | true | 2023-12-10T23:09:14Z | false | 63.822526 | 84.734117 | 63.045824 | 48.533561 | 80.899763 | 51.023503 | false |
| perlthoughts_Chupacabra-7B-v2.03-128k_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Chupacabra-7B-v2.03-128k | 22bb3c15b2770dfe91e239573b6c35b475a43cbe | 65.828009 | | 0 | 7 | false | true | true | true | 2023-12-10T23:09:46Z | false | 64.675768 | 84.564828 | 63.01603 | 51.161014 | 81.057616 | 50.492798 | false |
| perlthoughts_Chupacabra-7B-v2.04_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Chupacabra-7B-v2.04 | b6eb3c3293fff1cb3d38bbfefa9adfce3e20f053 | 68.515904 | apache-2.0 | 2 | 7 | true | true | true | true | 2024-01-05T08:54:21Z | false | 66.296928 | 85.70006 | 60.935373 | 67.758073 | 78.926598 | 51.478393 | false |
| perlthoughts_Chupacabra-8x7B-MoE_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | perlthoughts/Chupacabra-8x7B-MoE | 4df8e16bb4adeff6cfdd6c064819650ae27ff8fa | 70.402633 | apache-2.0 | 2 | 46 | true | true | false | true | 2023-12-16T07:59:48Z | false | 68.771331 | 86.108345 | 63.860885 | 63.503694 | 80.50513 | 59.666414 | false |
| perlthoughts_Chupacabra-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | perlthoughts/Chupacabra-v3 | 1dfa5e16d4be646b496d657d86554482ad48b3c9 | 59.524864 | | 0 | 7 | false | true | true | true | 2023-11-23T23:41:14Z | false | 66.211604 | 81.288588 | 59.362008 | 57.848622 | 77.426993 | 15.011372 | false |
| perlthoughts_Falkor-16b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | perlthoughts/Falkor-16b | 2365c7af9eb60bfa946b566dadd6802befa122e8 | 63.517314 | | 0 | 14 | false | true | true | true | 2023-12-07T06:39:15Z | false | 65.955631 | 82.622983 | 63.578853 | 62.766866 | 77.900552 | 28.278999 | false |
| perlthoughts_Falkor-7b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/Falkor-7b | b2e3c235196ba859b26ee14fb8c86e632bcf3e88 | 70.334376 | apache-2.0 | 3 | 7 | true | true | true | true | 2023-12-07T06:37:12Z | false | 68.259386 | 85.839474 | 63.984318 | 63.07542 | 80.347277 | 60.500379 | false |
| perlthoughts_Falkor-8x7B-MoE_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | perlthoughts/Falkor-8x7B-MoE | 8a13e5399c12811d178cea09ffa719596410c9b4 | 68.312943 | apache-2.0 | 4 | 46 | true | true | false | true | 2023-12-16T08:02:21Z | false | 66.296928 | 85.032862 | 64.128236 | 53.502386 | 80.189424 | 60.727824 | false |
| perlthoughts_Marcoroni-8x7B-v3-MoE_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | perlthoughts/Marcoroni-8x7B-v3-MoE | 901a733d1c01035bcbe69afd25dd9b4f982cb216 | 72.453808 | | 0 | 46 | false | true | false | true | 2023-12-17T22:51:23Z | false | 69.368601 | 86.775543 | 65.006791 | 60.398662 | 81.452249 | 71.721001 | false |
| perlthoughts_Mistral-7B-Instruct-v0.2-2x7B-MoE_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE | 46a2d11c1025e6ddec0fe35093d39e2e16170ca2 | 65.604424 | apache-2.0 | 4 | 12 | true | true | false | true | 2023-12-19T04:40:51Z | false | 62.969283 | 84.883489 | 60.741609 | 68.181364 | 77.426993 | 39.423806 | false |
| perlthoughts_Starling-LM-alpha-8x7B-MoE_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | perlthoughts/Starling-LM-alpha-8x7B-MoE | 61a66c526af1238690c815051c0f4ebe866ca588 | 67.113917 | cc-by-nc-4.0 | 5 | 46 | true | true | false | true | 2023-12-16T08:02:07Z | false | 63.651877 | 84.903406 | 64.680102 | 46.392492 | 80.584057 | 62.471569 | false |
| perlthoughts_neural-chat-v3-3-8x7b-MoE_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | perlthoughts/neural-chat-v3-3-8x7b-MoE | ef354e7938f1c38bb1f73f4ee9a7f325ae32fc2e | 71.170858 | apache-2.0 | 0 | 46 | true | true | false | true | 2023-12-17T22:51:42Z | false | 66.638225 | 85.431189 | 62.217683 | 63.196561 | 79.715864 | 69.825625 | false |
| perlthoughts_openchat-3.5-1210-32k_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | perlthoughts/openchat-3.5-1210-32k | 48fde7a1a1d644f603a828839047ff695165b387 | 64.490012 | apache-2.0 | 3 | 7 | true | true | true | true | 2023-12-30T20:31:58Z | false | 64.675768 | 84.056961 | 61.587005 | 49.314426 | 79.163378 | 48.142532 | false |
| perlthoughts_openchat-3.5-1210-32k-8x7b-MoE_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | perlthoughts/openchat-3.5-1210-32k-8x7b-MoE | c24bf500da78e987197055e96dda0dcc496de9ed | 64.479464 | | 0 | 46 | true | true | false | true | 2023-12-30T20:32:19Z | false | 64.590444 | 84.066919 | 61.596264 | 49.317248 | 79.163378 | 48.142532 | false |
| phanerozoic_Tiny-Cowboy-1.1b-v0.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | phanerozoic/Tiny-Cowboy-1.1b-v0.1 | 767476e8853f2a632b60743f54b7baf39661ceaa | 36.39784 | cc-by-nc-4.0 | 0 | 1 | true | true | true | true | 2024-04-03T09:05:48Z | false | 36.177474 | 60.047799 | 24.203309 | 36.151572 | 60.063141 | 1.743745 | false |
| phanerozoic_Tiny-Knight-1.1b-v0.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | phanerozoic/Tiny-Knight-1.1b-v0.1 | c083528250610f304d1a13f2641eedd1e2c9b9c3 | 36.485138 | cc-by-nc-4.0 | 0 | 1 | true | true | true | true | 2024-04-03T09:06:53Z | false | 35.580205 | 59.121689 | 24.666508 | 37.845585 | 59.194949 | 2.501895 | false |
| phanerozoic_Tiny-Pirate-1.1b-v0.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | phanerozoic/Tiny-Pirate-1.1b-v0.1 | 64882fe6a44b724cc407bec17206b7d3f359089e | 36.719176 | cc-by-nc-4.0 | 0 | 1 | true | true | true | true | 2024-04-03T09:04:50Z | false | 36.945392 | 60.167297 | 24.215575 | 35.838155 | 61.404893 | 1.743745 | false |
| pharaouk_fusedyi_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pharaouk/fusedyi | 5e3fdfa75a3bebd5d18d25e3bada1da27f200fd6 | 53.182242 | apache-2.0 | 0 | 10 | true | true | true | true | 2024-02-02T20:41:50Z | false | 55.03413 | 76.598287 | 63.428567 | 49.294065 | 72.691397 | 2.047005 | false |
| picAIso_TARS-8B_float16 | float16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | picAIso/TARS-8B | 34e8ed6b488f5d89648d80a507962bbb0aec03cd | 70.643572 | llama3 | 0 | 8 | true | false | true | true | 2024-05-31T13:12:52Z | false | 64.846416 | 84.285999 | 66.938444 | 59.085022 | 77.742699 | 70.962851 | false |
| pillowtalks-ai_delta13b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pillowtalks-ai/delta13b | 83fa0860990df1db35550f973ba4306449e35412 | 53.285374 | | 1 | 13 | false | true | true | true | 2023-10-16T12:46:18Z | false | 52.730375 | 80.13344 | 51.936148 | 52.07837 | 74.191002 | 8.642911 | false |
| pinkyponky_Mistral-7B-Instruct-Sft-Tuned-V0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | pinkyponky/Mistral-7B-Instruct-Sft-Tuned-V0.2 | 826783eb0e7f2fc471ab9dfeea59acd112a6ecc3 | 61.082699 | | 0 | 7 | false | true | true | true | 2024-01-17T01:22:31Z | false | 57.337884 | 78.948417 | 57.898574 | 50.658145 | 76.164167 | 45.489007 | false |
| pinkyponky_Mistral-7B-Instruct-sft-tuned-v0.2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | MistralForCausalLM | pinkyponky/Mistral-7B-Instruct-sft-tuned-v0.2 | 26b1b06ca6ee8db77d915e0ec685b3e999a226d0 | 62.28649 | | 0 | 7 | false | true | true | true | 2024-01-16T17:44:42Z | false | 58.020478 | 79.25712 | 58.780068 | 50.445601 | 76.874507 | 50.341168 | false |
| pinkyponky_Mistral-7b-instruct-v0.2-summ-sft-e1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1 | cb20f22f421052e1ca8ea8bd9974fade5ccdfa9d | 64.580499 | cc-by-nc-4.0 | 0 | 7 | false | true | true | true | 2024-01-21T04:00:39Z | false | 60.836177 | 83.369847 | 60.856101 | 64.979109 | 77.03236 | 40.409401 | false |
| pinkyponky_Mistral-7b-instruct-v0.2-summ-sft-e1_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e1 | cb20f22f421052e1ca8ea8bd9974fade5ccdfa9d | 63.277642 | cc-by-nc-4.0 | 0 | 7 | false | true | true | true | 2024-01-21T04:00:55Z | false | 60.153584 | 82.593109 | 58.923222 | 63.128774 | 77.111287 | 37.755876 | false |
| pinkyponky_Mistral-7b-instruct-v0.2-summ-sft-e2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2 | 01a73ccd10a275738304c695d0728a29e8586f47 | 64.666857 | cc-by-nc-4.0 | 0 | 7 | false | true | true | true | 2024-01-21T04:06:28Z | false | 61.433447 | 83.638717 | 61.031578 | 64.922976 | 76.716654 | 40.257771 | false |
| pinkyponky_Mistral-7b-instruct-v0.2-summ-sft-e2_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e2 | 01a73ccd10a275738304c695d0728a29e8586f47 | 63.064262 | cc-by-nc-4.0 | 0 | 7 | false | true | true | true | 2024-01-21T04:06:43Z | false | 59.47099 | 82.722565 | 59.481065 | 62.696422 | 76.637727 | 37.376801 | false |
| pinkyponky_Mistral-7b-instruct-v0.2-summ-sft-e3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3 | 5b09e3dd2bf8bcf08b9b3dd0d69e4cc67d782fd3 | 64.535839 | cc-by-nc-4.0 | 0 | 7 | false | true | true | true | 2024-01-21T04:07:05Z | false | 61.177474 | 83.718383 | 60.927692 | 64.938431 | 76.953433 | 39.499621 | false |
| pinkyponky_Mistral-7b-instruct-v0.2-summ-sft-e3_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | pinkyponky/Mistral-7b-instruct-v0.2-summ-sft-e3 | 5b09e3dd2bf8bcf08b9b3dd0d69e4cc67d782fd3 | 63.141044 | cc-by-nc-4.0 | 0 | 7 | false | true | true | true | 2024-01-21T04:07:11Z | false | 59.982935 | 82.762398 | 59.481867 | 62.999169 | 76.243094 | 37.376801 | false |
| pinkyponky_SOLAR-10.7B-dpo-instruct-tuned-v0.1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | pinkyponky/SOLAR-10.7B-dpo-instruct-tuned-v0.1 | bb3b052f07ab6bc00a03dc5c7b510c0760bfd650 | 68.67882 | cc-by-nc-4.0 | 0 | 10 | true | true | true | true | 2024-01-10T15:40:52Z | false | 65.187713 | 86.088429 | 66.25287 | 51.809374 | 83.977901 | 58.756634 | false |
| playdev7_theseed-v0.3_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | Original | MixtralForCausalLM | playdev7/theseed-v0.3 | 545fd9e47d92b243c42b521a64596f114c961b3f | 29.239106 | mit | 0 | 24 | true | true | true | true | 2024-03-21T07:52:33Z | false | 25.938567 | 26.050588 | 24.554052 | 46.326315 | 52.565114 | 0 | false |
| player1537_dolphinette_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | player1537/dolphinette | 20529d47b0a82343014727edd1639a9a6a6b09e6 | 30.653589 | | 0 | 0 | false | true | true | true | 2023-10-16T12:46:18Z | false | 24.914676 | 37.333201 | 25.37447 | 42.076614 | 54.222573 | 0 | false |
| pleisto_yuren-13b-chatml_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | pleisto/yuren-13b-chatml | d9479c8c554ef335b5fd5b9a2e328de03c35d50e | 55.387029 | llama2 | 3 | 13 | true | true | true | true | 2024-02-02T12:51:45Z | false | 53.071672 | 78.032264 | 56.338774 | 42.324312 | 74.427782 | 28.127369 | false |
| pmking27_PrathameshLLM-2B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | GemmaForCausalLM | pmking27/PrathameshLLM-2B | 8e708a65f07d0842ffd1fd81bc0cc14d1d81686a | 45.061967 | apache-2.0 | 2 | 2 | true | true | true | true | 2024-04-07T14:21:58Z | false | 44.709898 | 68.402709 | 38.20543 | 44.689893 | 65.114444 | 9.249431 | false |
| pmking27_PrathameshLLM-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | GemmaForCausalLM | pmking27/PrathameshLLM-7B | f93a2a648522f274e7178512c4561ab1769dd5e0 | 45.061967 | | 0 | 2 | false | true | true | true | 2024-04-02T08:01:30Z | false | 44.709898 | 68.402709 | 38.20543 | 44.689893 | 65.114444 | 9.249431 | false |
| pmking27_PrathameshLLM-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | GemmaForCausalLM | pmking27/PrathameshLLM-7B | f93a2a648522f274e7178512c4561ab1769dd5e0 | 45.178656 | | 0 | 2 | false | true | true | true | 2024-04-03T05:17:37Z | false | 44.96587 | 68.283211 | 38.24603 | 44.748725 | 65.351223 | 9.476876 | false |
| porkorbeef_Llama-2-13b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | porkorbeef/Llama-2-13b | 06253ee259e6b205c4734ab6ec3fa850737b2110 | 30.109427 | | 0 | 12 | false | true | true | true | 2023-10-16T12:48:18Z | false | 29.351536 | 26.349333 | 24.94171 | 48.317062 | 51.696922 | 0 | false |
| porkorbeef_Llama-2-13b-12_153950_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | porkorbeef/Llama-2-13b-12_153950 | ee9b0cf26f521b5cb2322d743880e8b6bfadb0b7 | 29.682996 | cc-by-nc-4.0 | 0 | 12 | false | true | true | true | 2023-10-16T12:48:18Z | false | 28.583618 | 26.578371 | 20.785545 | 49.032842 | 53.117601 | 0 | false |
| porkorbeef_Llama-2-13b-public_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | porkorbeef/Llama-2-13b-public | e1b32a8fcfc0f37fd5f50cf765151897574c73c7 | 29.454141 | | 0 | 12 | false | true | true | true | 2023-09-09T10:52:17Z | false | 29.948805 | 26.648078 | 22.735861 | 49.010098 | 48.382005 | 0 | false |
| porkorbeef_Llama-2-13b-sf_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | porkorbeef/Llama-2-13b-sf | 06253ee259e6b205c4734ab6ec3fa850737b2110 | 30.219722 | cc-by-nc-4.0 | 0 | 12 | false | true | true | true | 2023-09-09T10:52:17Z | false | 29.522184 | 26.488747 | 25.983866 | 48.968364 | 50.35517 | 0 | false |
| posicube_Llama-chat-AY-13B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | posicube/Llama-chat-AY-13B | 66037b5ee553f7b878d796d2b2d5ada5734cc164 | 58.341149 | llama2 | 0 | 13 | false | true | true | true | 2023-10-16T12:48:18Z | false | 62.798635 | 83.230432 | 60.013437 | 55.9466 | 75.927388 | 12.130402 | false |
| posicube_Llama2-chat-AYB-13B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | posicube/Llama2-chat-AYB-13B | cc7ca1b8f906b9f62ace094540f4ff4124dd581a | 58.447855 | llama2 | 14 | 13 | true | true | true | true | 2023-10-16T12:48:18Z | false | 63.395904 | 84.793866 | 59.336612 | 55.621217 | 76.243094 | 11.296437 | false |
| posicube_Llama2-chat-AYT-13B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | posicube/Llama2-chat-AYT-13B | dd12dced8076a959c03b8b5c4a4266f234d6639a | 57.878088 | llama2 | 17 | 13 | true | true | true | true | 2023-10-16T12:54:17Z | false | 63.31058 | 83.529177 | 59.674496 | 55.798676 | 76.085241 | 8.870356 | false |
| postbot_distilgpt2-emailgen_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | GPT2LMHeadModel | postbot/distilgpt2-emailgen | fe96d63cc2edcbd1ae444ada293cc59d1e01a6ad | 28.840249 | apache-2.0 | 3 | 0 | true | true | true | true | 2023-10-20T09:10:19Z | false | 21.757679 | 27.524398 | 25.971142 | 46.170278 | 51.617995 | 0 | false |
| postbot_distilgpt2-emailgen-V2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | GPT2LMHeadModel | postbot/distilgpt2-emailgen-V2 | 9750ba00e79a02e1bf98d3faa3d49b8ae0f8ae63 | 28.637027 | apache-2.0 | 2 | 0 | true | true | true | true | 2023-10-20T09:10:29Z | false | 20.989761 | 26.777534 | 25.529043 | 46.513197 | 52.012628 | 0 | false |
| postbot_emailgen-pythia-410m-deduped_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | GPTNeoXForCausalLM | postbot/emailgen-pythia-410m-deduped | e0208b02990c49138350da791f0b6fcb8a65e738 | 30.931134 | apache-2.0 | 0 | 0 | true | true | true | true | 2023-10-20T09:11:11Z | false | 27.901024 | 40.041824 | 27.354978 | 38.197425 | 52.091555 | 0 | false |
| postbot_gpt-neo-1.3B-emailgen_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | GPTNeoForCausalLM | postbot/gpt-neo-1.3B-emailgen | accdf0e43c0d1b313bc6d1fb307d67f1921ef3ca | 33.471901 | apache-2.0 | 2 | 1 | true | true | true | true | 2024-01-10T16:07:23Z | false | 29.948805 | 47.948616 | 24.11124 | 42.548079 | 56.274665 | 0 | false |
| postbot_gpt2-medium-emailgen_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | GPT2LMHeadModel | postbot/gpt2-medium-emailgen | 1b9b03d00b2b300d3c04c37fe3782c180ef51a27 | | | | | | | | | | | | | | | | | |
29.874906
apache-2.0
3
0
true
true
true
true
2023-11-18T15:04:15Z
false
26.450512
34.305915
24.102869
43.956041
50.434096
0
false
postbot_pythia-160m-hq-emails_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/postbot/pythia-160m-hq-emails" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">postbot/pythia-160m-hq-emails</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_postbot__pythia-160m-hq-emails" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
postbot/pythia-160m-hq-emails
6eeded627780b47b5221ed72ebea436514621964
29.255738
apache-2.0
1
0
true
true
true
true
2023-11-18T15:04:41Z
false
23.122867
30.053774
26.576247
45.5053
50.276243
0
false
postitive666_llama3_ruozhiba_8b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/postitive666/llama3_ruozhiba_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">postitive666/llama3_ruozhiba_8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_postitive666__llama3_ruozhiba_8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
postitive666/llama3_ruozhiba_8b
5c1e0c5882b5d488df7fcb2cd77cd26e93e56856
66.646838
apache-2.0
1
8
true
true
true
true
2024-04-19T15:20:47Z
false
60.580205
78.848835
66.797704
49.966948
75.453828
68.23351
false
ppopiolek_tinyllama_eng_long_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ppopiolek/tinyllama_eng_long" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ppopiolek/tinyllama_eng_long</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ppopiolek__tinyllama_eng_long" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ppopiolek/tinyllama_eng_long
877156c3b157e1fe1d10d4800169120f2a7e8fd1
37.278263
apache-2.0
0
0
true
true
true
true
2024-04-26T11:47:52Z
false
36.604096
61.880104
26.300089
35.875834
61.7206
1.288855
false
ppopiolek_tinyllama_eng_short_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ppopiolek/tinyllama_eng_short" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ppopiolek/tinyllama_eng_short</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ppopiolek__tinyllama_eng_short" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ppopiolek/tinyllama_eng_short
34b2f49ad0e13eaebfc9296e02af6dba1efc691e
36.949606
apache-2.0
0
0
true
true
true
true
2024-04-25T19:16:05Z
false
36.433447
60.963951
25.314723
37.393628
60.378848
1.21304
false
ppopiolek_tinyllama_eng_short1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ppopiolek/tinyllama_eng_short1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ppopiolek/tinyllama_eng_short1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ppopiolek__tinyllama_eng_short1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ppopiolek/tinyllama_eng_short1
ff195734d2552ba010c24f879d424e693b70ce1e
36.949606
0
1
false
true
true
true
2024-04-25T19:15:55Z
false
36.433447
60.963951
25.314723
37.393628
60.378848
1.21304
false
ppopiolek_tinyllama_merged_s_500_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ppopiolek/tinyllama_merged_s_500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ppopiolek/tinyllama_merged_s_500</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ppopiolek__tinyllama_merged_s_500" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ppopiolek/tinyllama_merged_s_500
9f6b35978a0c0a2a458f0d0d142162fc6eb4f379
37.048663
apache-2.0
0
1
true
true
true
true
2024-04-20T20:13:06Z
false
36.177474
61.322446
25.858911
35.718028
61.168114
2.047005
false
ppopiolek_tinyllama_merged_test_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ppopiolek/tinyllama_merged_test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ppopiolek/tinyllama_merged_test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ppopiolek__tinyllama_merged_test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ppopiolek/tinyllama_merged_test
596908fcf61290279be1d614db31ba2eb15d76a6
37.6813
apache-2.0
0
1
true
true
true
true
2024-04-18T15:52:09Z
false
37.201365
61.322446
25.701747
38.719826
61.24704
1.895375
false
prhegde_aligned-merge-aanaphi-phi2-orage-3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/prhegde/aligned-merge-aanaphi-phi2-orage-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prhegde/aligned-merge-aanaphi-phi2-orage-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prhegde__aligned-merge-aanaphi-phi2-orage-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prhegde/aligned-merge-aanaphi-phi2-orage-3b
e48401ef3061df10289ae52666c0c8a455792470
64.833895
apache-2.0
0
2
true
false
true
true
2024-05-18T14:25:35Z
false
63.737201
77.365067
58.079796
53.506695
74.980268
61.334344
false
prhegde_merge-aanaphi-phi2-orage-3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/prhegde/merge-aanaphi-phi2-orage-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prhegde/merge-aanaphi-phi2-orage-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prhegde__merge-aanaphi-phi2-orage-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prhegde/merge-aanaphi-phi2-orage-3b
6396a99299f440c7d7ec93786d7874a49accce7e
64.870404
mit
2
2
true
false
true
true
2024-03-26T04:25:02Z
false
63.566553
77.424816
58.214602
53.474397
74.980268
61.561789
false
prince-canuma_Damysus-2.7B-Chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/prince-canuma/Damysus-2.7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prince-canuma/Damysus-2.7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prince-canuma/Damysus-2.7B-Chat
d805640fae5928607626d5c89b66a9aaf98da752
60.492246
mit
4
2
false
true
true
true
2024-02-11T17:04:49Z
false
59.812287
74.517028
56.33342
46.744973
74.901342
50.644428
false
prince-canuma_Damysus-2.7B-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/prince-canuma/Damysus-2.7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prince-canuma/Damysus-2.7B-Chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Damysus-2.7B-Chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prince-canuma/Damysus-2.7B-Chat
d805640fae5928607626d5c89b66a9aaf98da752
60.254076
mit
4
2
false
true
true
true
2024-02-11T17:24:10Z
false
59.129693
74.357698
56.342769
46.445563
75.059195
50.189538
false
prince-canuma_Damysus-Coder-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/prince-canuma/Damysus-Coder-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prince-canuma/Damysus-Coder-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Damysus-Coder-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prince-canuma/Damysus-Coder-v0.1
dc898dc853ab93f6b9bdae507bcac0597f43f7c4
64.341993
apache-2.0
0
7
true
true
true
true
2024-04-13T09:42:56Z
false
60.921502
84.00717
60.540629
64.199193
77.111287
39.272176
false
prince-canuma_Llama-3-6B-v0.1_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prince-canuma/Llama-3-6B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prince-canuma/Llama-3-6B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__Llama-3-6B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prince-canuma/Llama-3-6B-v0.1
515d083eb961a974ca1484147745c14edc0ced5c
54.139662
llama3
9
6
true
true
true
true
2024-05-20T15:25:32Z
false
48.208191
74.666401
61.842494
43.004609
73.007103
24.109174
false
prince-canuma_im-a-good-llama3-step-46k_float16
float16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prince-canuma/im-a-good-llama3-step-46k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prince-canuma/im-a-good-llama3-step-46k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prince-canuma__im-a-good-llama3-step-46k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prince-canuma/im-a-good-llama3-step-46k
97eb10de9b6092e1900dc124ba6889b573df6c7b
54.177699
0
6
false
true
true
true
2024-05-17T18:11:45Z
false
48.208191
74.666401
61.855461
43.001755
72.770324
24.564064
false
princeton-nlp_Sheared-LLaMA-1.3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Sheared-LLaMA-1.3B
b1c3f74c8495e27b3963d64af0781d4a611794f3
35.951575
apache-2.0
88
1
true
true
true
true
2023-10-16T12:48:18Z
false
32.849829
60.914161
25.705062
37.143045
58.642463
0.45489
false
princeton-nlp_Sheared-LLaMA-1.3B-ShareGPT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B-ShareGPT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Sheared-LLaMA-1.3B-ShareGPT
d2f3cfae7746c4ff07353b39828985ea0f36b07d
37.143439
apache-2.0
9
1
true
true
true
true
2024-01-07T23:08:34Z
false
33.959044
62.547301
26.416939
43.034384
56.827151
0.075815
false
princeton-nlp_Sheared-LLaMA-2.7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Sheared-LLaMA-2.7B
16347024c4df6cd114720958964a850fc287cac0
40.8412
apache-2.0
57
2
true
true
true
true
2023-10-16T12:48:18Z
false
41.723549
71.011751
26.922072
37.319739
67.008682
1.06141
false
princeton-nlp_Sheared-LLaMA-2.7B-ShareGPT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
802be8903ec44f49a883915882868b479ecdcc3b
42.114322
apache-2.0
7
2
true
true
true
true
2024-01-05T11:14:19Z
false
41.040956
71.260705
28.503023
47.713924
64.167324
0
false
princeton-nlp_Sheared-Pythia-160m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/princeton-nlp/Sheared-Pythia-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">princeton-nlp/Sheared-Pythia-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-Pythia-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
princeton-nlp/Sheared-Pythia-160m
c8889f496254bae7b6196dfd64521e1581eb5567
29.409592
apache-2.0
4
0
true
true
true
true
2024-03-05T11:09:42Z
false
22.440273
32.065326
26.651402
43.224553
51.696922
0.379075
false
prithivida_Asimov-7B-v1_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/prithivida/Asimov-7B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivida/Asimov-7B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_prithivida__Asimov-7B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivida/Asimov-7B-v1
0b33ad0a6dde60156ee6008ff47f7cfa6cd27937
54.977239
mit
1
7
true
true
true
true
2023-11-17T13:32:19Z
false
59.044369
80.043816
56.348227
51.147554
73.954223
9.325246
false