Columns (name: dtype, value range or number of distinct classes):

eval_name: string, length 9 to 97
Precision: string, 5 classes
Type: string, 6 classes
T: string, 6 classes
Weight type: string, 3 classes
Architecture: string, 53 classes
Model: string, length 355 to 611
fullname: string, length 4 to 89
Model sha: string, length 0 to 40
Average ⬆️: float64, 27 to 81.3
Hub License: string, 35 classes
Hub ❤️: int64, 0 to 4.88k
#Params (B): int64, 0 to 238
Available on the hub: bool, 2 classes
Merged: bool, 2 classes
MoE: bool, 2 classes
Flagged: bool, 1 class
date: string, length 0 to 26
Chat Template: bool, 2 classes
ARC: float64, 19.7 to 87.5
HellaSwag: float64, 20.7 to 92.8
MMLU: float64, 17.8 to 89.4
TruthfulQA: float64, 27.9 to 82.3
Winogrande: float64, 47.2 to 91.5
GSM8K: float64, 0 to 88.2
Maintainers Choice: bool, 2 classes
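The rows below follow this schema. A minimal sketch (not the leaderboard's own tooling) of how such rows could be loaded and queried, assuming the table has been exported to a local file; "leaderboard.csv" is a hypothetical filename, and only the column names above are taken from the schema:

```python
# Minimal sketch, assuming a local CSV export of the table ("leaderboard.csv" is hypothetical).
import pandas as pd

BENCHMARKS = ["ARC", "HellaSwag", "MMLU", "TruthfulQA", "Winogrande", "GSM8K"]

df = pd.read_csv("leaderboard.csv")

# "Average ⬆️" appears to be the plain mean of the six benchmark columns
# (see the worked check after the first record below).
df["recomputed_average"] = df[BENCHMARKS].mean(axis=1)

# Example query: bfloat16 evaluations sorted by the reported average.
bf16 = df[df["Precision"] == "bfloat16"].sort_values("Average ⬆️", ascending=False)
print(bf16[["fullname", "Architecture", "#Params (B)", "Average ⬆️"]].head(10))
```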
adamo1139_Yi-34B-AEZAKMI-v1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/adamo1139/Yi-34B-AEZAKMI-v1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34B-AEZAKMI-v1
adamo1139/Yi-34B-AEZAKMI-v1
c56dc8471eba802f74fed756f555b718d975d00a
68.671262
apache-2.0
2
34
true
true
true
true
2023-12-01T00:16:35Z
false
64.334471
84.305915
73.913395
55.734077
80.820837
52.918878
false
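For the record above (adamo1139/Yi-34B-AEZAKMI-v1), the reported average of 68.671262 is consistent with the arithmetic mean of its six benchmark scores; a quick check:

```python
# Scores copied from the adamo1139/Yi-34B-AEZAKMI-v1 row above
# (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K).
scores = [64.334471, 84.305915, 73.913395, 55.734077, 80.820837, 52.918878]
print(sum(scores) / len(scores))  # ≈ 68.671262, matching the reported "Average ⬆️"
```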
adamo1139_Yi-34b-200K-rawrr-v2-run-0902-LoRA_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
https://huggingface.co/adamo1139/Yi-34b-200K-rawrr-v2-run-0902-LoRA · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-34b-200K-rawrr-v2-run-0902-LoRA
adamo1139/Yi-34b-200K-rawrr-v2-run-0902-LoRA
3fbaa2965a16992f1e8cddbc0c9b40efd6f15698
69.148869
apache-2.0
0
34
false
true
true
true
2024-02-09T21:35:08Z
false
64.675768
84.49512
75.755716
46.661756
81.136543
62.168309
false
adamo1139_Yi-6B-200K-AEZAKMI-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2
adamo1139/Yi-6B-200K-AEZAKMI-v2
0c4dd0e7119bbef9fa5b28b5a581b60822cebaf5
54.926485
apache-2.0
1
6
true
true
true
true
2024-01-10T21:41:17Z
false
52.986348
71.200956
63.003111
46.792273
70.481452
25.094769
false
adamo1139_Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO
adamo1139/Yi-6B-200K-AEZAKMI-v2-rawrr1-DPO
9271df80f5221362cb5ffd71f463f8f8d08c31dc
56.196546
apache-2.0
0
6
true
true
true
true
2024-01-10T21:42:49Z
false
52.474403
77.036447
62.572621
47.147535
71.033938
26.914329
false
adamo1139_yi-34b-200k-rawrr-dpo-1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-1
adamo1139/yi-34b-200k-rawrr-dpo-1
2f6396382239da8aa2858393c62f0c5596bd09f0
70.967043
apache-2.0
2
34
true
true
true
true
2024-01-16T08:49:23Z
false
65.443686
85.690102
76.087196
53.998037
82.794002
61.789234
false
adamo1139_yi-34b-200k-rawrr-dpo-2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/adamo1139/yi-34b-200k-rawrr-dpo-2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__yi-34b-200k-rawrr-dpo-2
adamo1139/yi-34b-200k-rawrr-dpo-2
6682e3f76d02f280c4a265c9192c5a9e117cfdd4
69.417875
apache-2.0
2
34
true
true
true
true
2024-01-27T12:25:29Z
false
64.675768
84.744075
75.95718
46.152359
83.188635
61.789234
false
adonlee_LLaMA_2_13B_SFT_v0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/adonlee/LLaMA_2_13B_SFT_v0 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__LLaMA_2_13B_SFT_v0
adonlee/LLaMA_2_13B_SFT_v0
a6790d83337578f38d2bcd51038a779eaa8d0fac
57.306441
apache-2.0
0
13
true
true
true
true
2023-10-16T12:46:18Z
false
62.030717
83.798048
58.389526
49.917554
77.26914
12.433662
false
adonlee_LLaMA_2_13B_SFT_v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/adonlee/LLaMA_2_13B_SFT_v1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__LLaMA_2_13B_SFT_v1
adonlee/LLaMA_2_13B_SFT_v1
31421b19a3f5fe2eff4871c86d3a94d5723b6fd2
63.042013
apache-2.0
2
13
true
true
true
true
2023-11-16T03:09:17Z
false
64.505119
83.379805
58.603134
53.201728
78.531965
40.030326
false
adonlee_Mistral_7B_SFT_DPO_v0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/adonlee/Mistral_7B_SFT_DPO_v0 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adonlee__Mistral_7B_SFT_DPO_v0
adonlee/Mistral_7B_SFT_DPO_v0
03955e2748064dcfac121e35e4e060cf6f48e259
72.17144
apache-2.0
1
7
true
true
true
true
2024-02-05T02:45:20Z
false
66.296928
84.903406
64.534681
69.71824
81.767956
65.80743
false
adowu_autocodit_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/adowu/autocodit · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_adowu__autocodit
adowu/autocodit
ce275915c13aa3d1c037b89e9e138c089c43ab5a
69.568698
apache-2.0
0
7
true
true
true
true
2024-04-09T18:51:17Z
false
66.382253
84.82374
65.094537
59.954522
80.50513
60.652009
false
aeonium_Aeonium-v1-BaseWeb-1B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/aeonium/Aeonium-v1-BaseWeb-1B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aeonium__Aeonium-v1-BaseWeb-1B
aeonium/Aeonium-v1-BaseWeb-1B
a9ca512e03f53a9229cffb5e46727104bf4b0f8e
29.147435
0
1
false
true
true
true
2024-05-06T08:20:18Z
false
20.989761
32.244573
26.152533
46.168617
49.329124
0
false
aerdincdal_CBDDO-LLM-8B-Instruct-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/aerdincdal/CBDDO-LLM-8B-Instruct-v0.1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aerdincdal__CBDDO-LLM-8B-Instruct-v0.1
aerdincdal/CBDDO-LLM-8B-Instruct-v0.1
a831ecff981368538f8145518df7e3a2160ee65d
64.171582
0
8
false
true
true
true
2024-04-30T08:09:59Z
false
57.423208
75.403306
65.737866
51.209934
76.953433
58.301744
false
aerdincdal_CBDDO-LLM-8B-Instruct-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/aerdincdal/CBDDO-LLM-8B-Instruct-v1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aerdincdal__CBDDO-LLM-8B-Instruct-v1
aerdincdal/CBDDO-LLM-8B-Instruct-v1
84430552036c85cc6a16722b26496df4d93f3afe
56.935298
mit
1
8
true
true
true
true
2024-05-02T09:04:46Z
false
54.607509
76.339375
60.619659
50.816621
74.664562
24.564064
false
aevalone_Pengland-Merge_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/aevalone/Pengland-Merge · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aevalone__Pengland-Merge
aevalone/Pengland-Merge
404bfbd322f0f5168d23a1ba8dff85e46d971db2
40.716484
0
8
false
true
true
true
2024-01-24T16:19:57Z
false
40.52901
47.062338
50.719214
47.03017
58.958169
0
false
aevalone_Test-7B-pthrough_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/aevalone/Test-7B-pthrough · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aevalone__Test-7B-pthrough
aevalone/Test-7B-pthrough
67127c0796b2c49f86f68ebb10e6a5707e0d59cf
42.467152
0
8
false
true
true
true
2024-01-24T15:49:38Z
false
44.368601
51.194981
49.309726
48.574496
60.142068
1.21304
false
ahnyeonchan_OpenOrca-AYT-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Delta
LlamaForCausalLM
https://huggingface.co/ahnyeonchan/OpenOrca-AYT-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ahnyeonchan__OpenOrca-AYT-13B
ahnyeonchan/OpenOrca-AYT-13B
1357abceda30e8389007a023907824cc3a11e397
null
llama2
0
13
false
true
true
true
2023-11-06T10:31:15Z
false
27.21843
26.030671
25.113111
null
49.723757
0
false
ahxt_LiteLlama-460M-1T_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/ahxt/LiteLlama-460M-1T · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ahxt__LiteLlama-460M-1T
ahxt/LiteLlama-460M-1T
77b8a976440e7d1ea5a890eaf1e0175b1cac0078
30.161396
mit
161
0
true
true
true
true
2024-01-10T15:13:00Z
false
24.829352
38.388767
25.959187
41.593753
50.197316
0
false
ahxt_llama2_xs_460M_experimental_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/ahxt/llama2_xs_460M_experimental · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ahxt__llama2_xs_460M_experimental
ahxt/llama2_xs_460M_experimental
c8db281477559f5c969a9be794ce236f8a99e1a0
30.171082
null
12
0
false
true
true
true
2023-10-16T12:48:18Z
false
24.914676
38.468433
26.17031
41.591462
49.88161
0
false
ai-business_Luban-13B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
https://huggingface.co/ai-business/Luban-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ai-business__Luban-13B
ai-business/Luban-13B
01b0f2046083dd8d9d8f9e626d78d83eaa1d57dd
57.729478
0
12
false
true
true
true
2023-10-16T12:48:18Z
false
63.054608
82.802231
58.726676
55.53023
76.5588
9.704321
false
ai-forever_mGPT_float16
float16
🟢 pretrained
🟢
Original
GPT2LMHeadModel
https://huggingface.co/ai-forever/mGPT · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ai-forever__mGPT
ai-forever/mGPT
40897bd7c8b47a76802c411108ca6220438b8b40
27.605187
apache-2.0
235
0
true
true
true
true
2023-12-29T19:37:30Z
false
23.805461
26.369249
25.168905
39.616632
50.670876
0
false
ai-forever_rugpt3large_based_on_gpt2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/ai-forever/rugpt3large_based_on_gpt2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ai-forever__rugpt3large_based_on_gpt2
ai-forever/rugpt3large_based_on_gpt2
8201db0de8deb68f25e7309db04d163b71970494
29.526564
null
70
0
false
true
true
true
2023-10-16T13:00:29Z
false
22.610922
32.842063
24.897381
43.388159
53.117601
0.30326
false
ai4bharat_Airavata_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ai4bharat/Airavata · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ai4bharat__Airavata
ai4bharat/Airavata
3fd8340a3683c8e7695c89a463428fcc0b2a875a
45.520612
llama2
26
6
true
true
true
true
2024-01-26T02:42:39Z
false
46.501706
69.259112
43.901763
40.6189
68.823994
4.018196
false
aihub-app_ZySec-7B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/aihub-app/ZySec-7B-v1 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__ZySec-7B-v1
aihub-app/ZySec-7B-v1
458f098e529e7ec670a02cc7b75a1a74496984a8
61.077922
0
7
false
true
true
true
2024-01-28T11:58:34Z
false
63.481229
85.012946
60.143404
56.493227
78.137332
23.199393
false
aihub-app_ZySec-8B-v2_float16
float16
🟢 pretrained
🟢
Original
GemmaForCausalLM
https://huggingface.co/aihub-app/ZySec-8B-v2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__ZySec-8B-v2
aihub-app/ZySec-8B-v2
9554702bbe26b1d1515e75ccb0b3549096622440
54.627193
0
8
false
true
true
true
2024-02-27T19:47:21Z
false
53.071672
76.299542
54.547327
47.047994
68.745067
28.051554
false
aihub-app_zyte-1.1B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/aihub-app/zyte-1.1B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__zyte-1.1B
aihub-app/zyte-1.1B
4537b28d9b2e9958c53b6d4aa6e16f46f85c1867
38.224014
0
1
false
true
true
true
2024-01-11T04:35:22Z
false
37.883959
61.372237
24.620295
42.145546
61.95738
1.36467
false
aihub-app_zyte-1.1b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/aihub-app/zyte-1.1b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__zyte-1.1b
aihub-app/zyte-1.1b
3d4e61bc3c090a28355cceba8da106c31e3bbb84
37.696161
0
1
false
true
true
true
2024-01-10T19:14:56Z
false
37.542662
60.824537
24.572031
39.457682
62.036306
1.743745
false
aihub-app_zyte-1B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/aihub-app/zyte-1B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aihub-app__zyte-1B
aihub-app/zyte-1B
6c2b31ee038f8df37547c013d73b91c4a07e41a5
38.234476
0
1
false
true
true
true
2024-01-12T11:40:51Z
false
37.883959
61.372237
24.612458
42.140336
61.95738
1.440485
false
aiplanet_buddhi-128k-chat-7b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/aiplanet/buddhi-128k-chat-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aiplanet__buddhi-128k-chat-7b
aiplanet/buddhi-128k-chat-7b
808b4230c690938f8ba2446b95e31fcb9cbeedf1
64.420879
apache-2.0
14
7
true
true
true
true
2024-04-05T07:22:09Z
false
60.836177
83.997212
60.421037
65.715127
77.26914
38.286581
false
aiplanet_effi-7b_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
LlamaForCausalLM
https://huggingface.co/aiplanet/effi-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aiplanet__effi-7b
aiplanet/effi-7b
d58c62ee27cae60392bd0bd53e1fd05ea82e273b
47.421656
apache-2.0
3
7
true
true
true
true
2023-10-16T12:48:18Z
false
55.119454
78.072097
35.914376
39.706236
72.533544
3.18423
false
aiplanet_panda-coder-13B_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/aiplanet/panda-coder-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aiplanet__panda-coder-13B
aiplanet/panda-coder-13B
823a8320224cdac88e927aee00338ffa79395faa
null
apache-2.0
13
13
true
true
true
true
2023-11-06T10:31:15Z
false
22.696246
25.044812
23.116858
null
49.565904
0
false
airesearch_LLaMa3-8b-WangchanX-sft-Demo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/airesearch/LLaMa3-8b-WangchanX-sft-Demo · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_airesearch__LLaMa3-8b-WangchanX-sft-Demo
airesearch/LLaMa3-8b-WangchanX-sft-Demo
4e6641e65044ff8f735f5faf162ff0bf97065f9a
63.221507
llama3
1
8
true
true
true
true
2024-04-28T05:57:45Z
false
60.409556
83.120892
65.479668
41.050021
77.03236
52.236543
false
airesearch_PolyLM-13b-WangchanX-sft-Demo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GPT2LMHeadModel
https://huggingface.co/airesearch/PolyLM-13b-WangchanX-sft-Demo · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_airesearch__PolyLM-13b-WangchanX-sft-Demo
airesearch/PolyLM-13b-WangchanX-sft-Demo
20008082cb0c52712ea39215ea893719ca061f06
null
cc-by-nc-3.0
0
13
true
true
true
true
2024-04-28T06:12:58Z
false
22.696246
25.044812
23.116858
null
49.565904
0
false
airesearch_typhoon-7b-WangchanX-sft-Demo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/airesearch/typhoon-7b-WangchanX-sft-Demo · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_airesearch__typhoon-7b-WangchanX-sft-Demo
airesearch/typhoon-7b-WangchanX-sft-Demo
f4fe3cc82611a40b9d3b336e91e4fb3582c3d1e6
61.170749
cc-by-nc-3.0
1
7
true
true
true
true
2024-04-28T05:57:30Z
false
58.959044
82.383987
57.67455
44.828103
76.400947
46.777862
false
aisquared_chopt-1_3b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
OPTForCausalLM
https://huggingface.co/aisquared/chopt-1_3b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__chopt-1_3b
aisquared/chopt-1_3b
fdd3691978f557baf9d1c20d4ede900c47f7e135
35.317892
other
0
3
true
true
true
true
2023-10-16T12:48:18Z
false
31.484642
56.632145
25.350454
40.192283
58.24783
0
false
aisquared_chopt-2_7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
OPTForCausalLM
https://huggingface.co/aisquared/chopt-2_7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__chopt-2_7b
aisquared/chopt-2_7b
45f57352c10a1fb1ec13c4bf387a15552ca1fe65
36.718903
other
0
7
true
true
true
true
2023-09-09T10:52:17Z
false
36.006826
63.383788
25.442513
37.706019
57.77427
0
false
aisquared_dlite-v1-124m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v1-124m · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-124m
aisquared/dlite-v1-124m
f6fd5f3960f31881e6cee23f5a872ecc80b40283
27.855497
apache-2.0
0
0
true
true
true
true
2023-09-09T10:52:17Z
false
24.317406
31.159132
25.08005
36.37908
50.197316
0
false
aisquared_dlite-v1-1_5b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v1-1_5b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-1_5b
aisquared/dlite-v1-1_5b
4ac21faec255e3544e96aeb3591c27bdee5ebf45
33.347227
apache-2.0
2
5
true
true
true
true
2023-09-09T10:52:17Z
false
31.65529
49.691297
25.61721
37.084793
55.958958
0.075815
false
aisquared_dlite-v1-355m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v1-355m · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-355m
aisquared/dlite-v1-355m
c5f4b5a61e6a66a5c7613164d99a70db5bf7e9a2
30.541324
apache-2.0
2
0
true
true
true
true
2023-10-16T12:54:17Z
false
27.133106
39.065923
27.121584
37.125433
52.801894
0
false
aisquared_dlite-v1-774m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v1-774m · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v1-774m
aisquared/dlite-v1-774m
d3f5401d07965fb13c2cb8b458ffaed9a5a79c2d
31.510215
apache-2.0
0
0
true
true
true
true
2023-09-09T10:52:17Z
false
28.071672
44.353714
25.906758
36.111937
54.617206
0
false
aisquared_dlite-v2-124m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v2-124m · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-124m
aisquared/dlite-v2-124m
bc719f990748ea72be4b6c270df34fc3d37291dc
28.297227
apache-2.0
6
0
true
true
true
true
2023-10-16T12:54:17Z
false
23.976109
31.099383
25.289744
38.984032
50.434096
0
false
aisquared_dlite-v2-1_5b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v2-1_5b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-1_5b
aisquared/dlite-v2-1_5b
97440ff1b6ef749423758e3495cdce1b5e68ee92
34.200093
apache-2.0
13
5
true
true
true
true
2023-09-09T10:52:17Z
false
32.593857
53.98327
24.930597
38.769255
54.696133
0.227445
false
aisquared_dlite-v2-355m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v2-355m · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-355m
aisquared/dlite-v2-355m
f51d310aebc16a9fe0d999d2a437b5faff635716
31.200456
apache-2.0
7
0
true
true
true
true
2023-09-09T10:52:17Z
false
28.327645
40.539733
26.773027
38.760439
52.801894
0
false
aisquared_dlite-v2-774m_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/aisquared/dlite-v2-774m · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aisquared__dlite-v2-774m
aisquared/dlite-v2-774m
0ea894a33e491912cd1a65dde47b4af03f03c4f2
32.858776
apache-2.0
10
0
true
true
true
true
2023-09-09T10:52:17Z
false
30.119454
47.679745
25.369273
39.998391
53.985793
0
false
aivince_alpaca_mistral-7b-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/aivince/alpaca_mistral-7b-v0.2 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_aivince__alpaca_mistral-7b-v0.2
aivince/alpaca_mistral-7b-v0.2
a7f91c5db12f3baf8d4e0279dde5a2183ddb070c
60.414838
0
7
false
true
true
true
2024-03-24T06:49:48Z
false
60.921502
83.280223
61.820379
42.656084
79.163378
34.64746
false
ajibawa-2023_Code-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Code-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-13B
ajibawa-2023/Code-13B
91f5a6d5cdf93aeb86dd8965e195d51522957fc6
54.805698
cc-by-nc-nd-4.0
13
13
true
true
true
true
2023-12-08T19:10:11Z
false
57.337884
83.280223
53.16536
42.461563
73.55959
19.029568
false
ajibawa-2023_Code-290k-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Code-290k-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-290k-13B
ajibawa-2023/Code-290k-13B
e2595df2aedc1decaf73d167ce0114e7a9cb2126
52.958661
cc-by-nc-nd-4.0
8
13
true
true
true
true
2024-01-16T18:18:08Z
false
56.05802
81.5475
51.987657
37.650862
72.691397
17.816528
false
ajibawa-2023_Code-290k-6.7B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Code-290k-6.7B-Instruct · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-290k-6.7B-Instruct
ajibawa-2023/Code-290k-6.7B-Instruct
4ef569814773fac1700bfb8c563118d497af7b76
36.64428
other
6
6
true
true
true
true
2024-02-27T05:30:44Z
false
34.897611
51.991635
34.891897
41.953006
52.644041
3.487491
false
ajibawa-2023_Code-Llama-3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Code-Llama-3-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-Llama-3-8B
ajibawa-2023/Code-Llama-3-8B
bc5ba67c353bd2e79bff9c881979fd5d1d07eb25
47.797409
llama3
23
8
true
true
true
true
2024-05-07T05:37:33Z
false
49.744027
72.774348
49.155639
44.668299
67.561168
2.88097
false
ajibawa-2023_Code-Mistral-7B_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/ajibawa-2023/Code-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B
ajibawa-2023/Code-Mistral-7B
ed3b9ad583910423a7b82e27274681e3865206f1
68.398656
apache-2.0
15
7
true
true
true
true
2024-03-25T04:41:53Z
false
63.566553
83.708425
63.377696
51.806585
81.21547
66.71721
false
ajibawa-2023_Code-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/ajibawa-2023/Code-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Code-Mistral-7B
ajibawa-2023/Code-Mistral-7B
ed3b9ad583910423a7b82e27274681e3865206f1
69.973822
apache-2.0
15
7
true
true
true
true
2024-03-25T06:50:49Z
false
64.590444
85.291775
65.000109
54.63721
82.241515
68.08188
false
ajibawa-2023_General-Stories-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/ajibawa-2023/General-Stories-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__General-Stories-Mistral-7B
ajibawa-2023/General-Stories-Mistral-7B
7ca9755d8a5cd2c7bf6e382269fa277a5bc44cc0
68.968092
apache-2.0
4
7
true
true
true
true
2024-04-21T13:11:17Z
false
67.406143
83.250349
64.130148
58.366334
79.321231
61.334344
false
ajibawa-2023_OpenHermes-2.5-Code-290k-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/OpenHermes-2.5-Code-290k-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__OpenHermes-2.5-Code-290k-13B
ajibawa-2023/OpenHermes-2.5-Code-290k-13B
5fe89b1eb555644dd8a658c74ea118620ba3fdc1
63.32884
apache-2.0
10
13
true
true
true
true
2024-03-01T12:28:31Z
false
57.337884
80.481976
56.52734
52.501681
74.822415
58.301744
false
ajibawa-2023_Python-Code-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Python-Code-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Python-Code-13B
ajibawa-2023/Python-Code-13B
981454b6a2275f787592589609df7f2bf558706d
53.605967
cc-by-nc-nd-4.0
6
13
true
true
true
true
2023-11-15T13:55:56Z
false
58.788396
81.65704
54.778255
42.826268
74.033149
9.552691
false
ajibawa-2023_Python-Code-33B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Python-Code-33B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Python-Code-33B
ajibawa-2023/Python-Code-33B
cf9a561b57145748455fd3e193d2b0e4ae0a0fce
55.055767
cc-by-nc-nd-4.0
8
33
true
true
true
true
2023-12-04T08:53:26Z
false
56.313993
81.009759
54.218265
44.39434
75.217048
19.181198
false
ajibawa-2023_Scarlett-Llama-3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Scarlett-Llama-3-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Scarlett-Llama-3-8B
ajibawa-2023/Scarlett-Llama-3-8B
c6bfb37be02b69c34d70faf32b2595ae20d402ea
65.762652
other
5
8
true
true
true
true
2024-04-22T11:39:03Z
false
62.627986
83.857797
66.455644
56.267511
78.058406
47.308567
false
ajibawa-2023_Scarlett-Llama-3-8B-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Scarlett-Llama-3-8B-v1.0 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Scarlett-Llama-3-8B-v1.0
ajibawa-2023/Scarlett-Llama-3-8B-v1.0
21167cb1cc000f274a6d22367d179d4bc076c6a2
64.924961
other
4
8
true
true
true
true
2024-04-29T12:42:09Z
false
62.116041
83.977295
66.36154
55.97978
77.900552
43.214556
false
ajibawa-2023_SlimOrca-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/SlimOrca-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__SlimOrca-13B
ajibawa-2023/SlimOrca-13B
75427e93dc99a5e1d8b9aefa106ad36fc750b744
60.390634
cc-by-nc-nd-4.0
10
13
true
true
true
true
2023-11-27T19:17:16Z
false
60.153584
81.398128
57.041943
49.367855
74.427782
39.954511
false
ajibawa-2023_SlimOrca-Llama-3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/SlimOrca-Llama-3-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__SlimOrca-Llama-3-8B
ajibawa-2023/SlimOrca-Llama-3-8B
c02fe5008d9778aba41936510701fadd00e94a28
54.823371
apache-2.0
3
8
true
true
true
true
2024-05-27T04:47:25Z
false
50.767918
71.509659
53.116475
48.030825
68.745067
36.770281
false
ajibawa-2023_Uncensored-Frank-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Uncensored-Frank-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-13B
ajibawa-2023/Uncensored-Frank-13B
73a27445e5e5a72857626e551c70542ec607f60c
55.640155
cc-by-nc-nd-4.0
7
13
true
true
true
true
2023-10-16T13:19:55Z
false
61.604096
82.622983
54.551247
48.340341
74.743489
11.978772
false
ajibawa-2023_Uncensored-Frank-33B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Uncensored-Frank-33B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-33B
ajibawa-2023/Uncensored-Frank-33B
1c1f4e9256ac2be145a9106863ee9f2e9d701e74
58.376706
cc-by-nc-nd-4.0
7
33
true
true
true
true
2023-10-16T12:54:17Z
false
62.116041
83.300139
57.572435
54.033515
76.5588
16.679303
false
ajibawa-2023_Uncensored-Frank-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Uncensored-Frank-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-7B
ajibawa-2023/Uncensored-Frank-7B
65bbcb80158a6d2e133bba99a90142caf4e2e242
47.899032
cc-by-nc-nd-4.0
5
7
true
true
true
true
2023-10-16T12:48:18Z
false
54.266212
76.518622
37.501308
43.859589
70.244672
5.003791
false
ajibawa-2023_Uncensored-Frank-Llama-3-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Uncensored-Frank-Llama-3-8B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Frank-Llama-3-8B
ajibawa-2023/Uncensored-Frank-Llama-3-8B
6ca3d5278fce2033dc8d2526a0ca72d345a5edbd
62.241772
llama3
11
8
true
true
true
true
2024-05-04T12:20:01Z
false
59.641638
80.163314
63.075399
52.750285
73.164957
44.655042
false
ajibawa-2023_Uncensored-Jordan-13B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Uncensored-Jordan-13B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Jordan-13B
ajibawa-2023/Uncensored-Jordan-13B
c56a396342133bbd75ab3f79622c85cb55be49a4
56.272655
cc-by-nc-nd-4.0
6
13
true
true
true
true
2023-11-15T13:57:18Z
false
57.423208
82.702649
55.746255
50.512465
76.164167
15.087187
false
ajibawa-2023_Uncensored-Jordan-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/Uncensored-Jordan-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Uncensored-Jordan-7B
ajibawa-2023/Uncensored-Jordan-7B
96a9fbe5aaef8410a8d0dad25f3cc97b408c4efb
49.949568
cc-by-nc-nd-4.0
5
7
true
true
true
true
2023-11-15T13:57:47Z
false
51.279863
77.365067
45.694527
47.497547
71.112865
6.747536
false
ajibawa-2023_WikiHow-Mistral-Instruct-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/ajibawa-2023/WikiHow-Mistral-Instruct-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__WikiHow-Mistral-Instruct-7B
ajibawa-2023/WikiHow-Mistral-Instruct-7B
4ad83e84cf315977c49c96e91dc28f09f86987f9
61.249136
apache-2.0
7
7
true
true
true
true
2024-03-26T05:16:23Z
false
60.921502
80.989843
58.574679
62.163635
74.822415
30.022745
false
ajibawa-2023_Young-Children-Storyteller-Mistral-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/ajibawa-2023/Young-Children-Storyteller-Mistral-7B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__Young-Children-Storyteller-Mistral-7B
ajibawa-2023/Young-Children-Storyteller-Mistral-7B
7316729c96d4862ad80df677e1716739aee43c91
71.083832
apache-2.0
7
7
true
true
true
true
2024-04-04T12:56:34Z
false
68.686007
84.674368
64.105863
62.620375
81.21547
65.20091
false
ajibawa-2023_carl-33b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/carl-33b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__carl-33b
ajibawa-2023/carl-33b
5f80b372b493d901cab4490b4f23c71499023615
56.029112
cc-by-nc-nd-4.0
10
33
true
true
true
true
2023-10-16T12:48:18Z
false
64.590444
85.271858
58.378562
45.322253
76.243094
6.368461
false
ajibawa-2023_carl-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/carl-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__carl-7b
ajibawa-2023/carl-7b
de4c7af9598bebc47dd43253c972be719f3195d6
46.163807
cc-by-nc-nd-4.0
5
7
true
true
true
true
2023-10-16T12:54:17Z
false
53.498294
78.291177
33.963183
40.29271
68.587214
2.350265
false
ajibawa-2023_scarlett-33b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/scarlett-33b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__scarlett-33b
ajibawa-2023/scarlett-33b
305eea72fb9fe2ac5929a62483ea51f152bcc060
58.810627
cc-by-nc-nd-4.0
24
33
true
true
true
true
2023-09-09T10:52:17Z
false
67.74744
85.48098
58.979913
61.054694
76.79558
2.805155
false
ajibawa-2023_scarlett-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/ajibawa-2023/scarlett-7b · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_ajibawa-2023__scarlett-7b
ajibawa-2023/scarlett-7b
0715b738e750830ba7213f26fe32fa1cc1bb15b3
49.085398
other
4
7
true
true
true
true
2023-10-16T13:00:29Z
false
57.167235
80.272854
36.110698
48.519432
72.138911
0.30326
false
akjindal53244_Mistral-7B-v0.1-Open-Platypus_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_akjindal53244__Mistral-7B-v0.1-Open-Platypus
akjindal53244/Mistral-7B-v0.1-Open-Platypus
aa2c84e89c4c8a10e0569e45021b59e6d1c08bda
58.920617
apache-2.0
8
7
true
true
true
true
2023-10-16T12:48:18Z
false
62.372014
85.082653
63.790977
47.328463
77.663773
17.285823
false
alchemonaut_BoreanGale-70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/alchemonaut/BoreanGale-70B · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__BoreanGale-70B
alchemonaut/BoreanGale-70B
f7768207c1f37d3f4374dccc182d7a86c6539ead
76.483602
other
4
68
true
false
true
true
2024-02-02T17:14:47Z
false
73.890785
89.374627
75.185904
68.596182
84.530387
67.32373
false
alchemonaut_QuartetAnemoi-70B-t0.0001_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/alchemonaut/QuartetAnemoi-70B-t0.0001 · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__QuartetAnemoi-70B-t0.0001
alchemonaut/QuartetAnemoi-70B-t0.0001
392d963e63267650f2aea7dc26c60ee6fd2b26d4
76.859252
other
30
68
true
false
true
true
2024-02-04T03:32:46Z
false
73.37884
88.896634
75.415558
69.532241
85.319653
68.612585
false
alexredna_TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_alexredna__TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo
alexredna/TinyLlama-1.1B-Chat-v1.0-reasoning-v2-dpo
f61da97b0c79b404f3dbe88f9379d1c918777338
37.029043
apache-2.0
2
1
true
true
true
true
2024-01-08T09:45:42Z
false
34.385666
61.870145
26.335379
36.126081
63.456985
0
false
alexredna_Tukan-1.1B-Chat-reasoning-sft-COLA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA · 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_alexredna__Tukan-1.1B-Chat-reasoning-sft-COLA
alexredna/Tukan-1.1B-Chat-reasoning-sft-COLA
fa129eb7563bc1f8234dc372d6255bec3c3b4143
36.526945
apache-2.0
0
1
true
true
true
true
2024-01-22T22:37:07Z
false
34.129693
59.778929
24.86015
38.254749
60.773481
1.36467
false
| alibidaran_medical_transcription_generator_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPT2LMHeadModel | [alibidaran/medical_transcription_generator](https://huggingface.co/alibidaran/medical_transcription_generator) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_alibidaran__medical_transcription_generator) | alibidaran/medical_transcription_generator | f622239151c89c2db0f1cef495d1b42afd16ce64 | 29.025059 |  | 2 | 0 | false | true | true | true | 2023-10-16T12:48:18Z | false | 22.78157 | 30.601474 | 23.836501 | 46.496713 | 50.434096 | 0 | false |
| alignment-handbook_zephyr-7b-dpo-full_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | [alignment-handbook/zephyr-7b-dpo-full](https://huggingface.co/alignment-handbook/zephyr-7b-dpo-full) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-full) | alignment-handbook/zephyr-7b-dpo-full | 8e0975a3a9f7a25a47b5f6f8e2601365829ecfad | 58.251923 | apache-2.0 | 2 | 7 | true | true | true | true | 2024-04-08T00:25:03Z | false | 62.883959 | 84.44533 | 59.561964 | 47.407882 | 76.637727 | 18.574678 | false |
| alignment-handbook_zephyr-7b-dpo-qlora_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | [alignment-handbook/zephyr-7b-dpo-qlora](https://huggingface.co/alignment-handbook/zephyr-7b-dpo-qlora) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-dpo-qlora) | alignment-handbook/zephyr-7b-dpo-qlora | b991e934e478e9b406d07840940e9a785a62f0ba | 63.508719 | apache-2.0 | 7 | 7 | true | true | true | true | 2024-01-26T19:51:12Z | false | 63.651877 | 85.351524 | 63.821139 | 47.144915 | 79.005525 | 42.077331 | false |
| alignment-handbook_zephyr-7b-sft-full_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [alignment-handbook/zephyr-7b-sft-full](https://huggingface.co/alignment-handbook/zephyr-7b-sft-full) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full) | alignment-handbook/zephyr-7b-sft-full | 92f9fac4529acacb2c33a35c46917393690c6311 | 57.515357 | apache-2.0 | 20 | 7 | true | true | true | true | 2024-01-16T02:38:43Z | false | 58.105802 | 80.830512 | 60.19571 | 41.741282 | 76.243094 | 27.975739 | false |
| alignment-handbook_zephyr-7b-sft-full_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | [alignment-handbook/zephyr-7b-sft-full](https://huggingface.co/alignment-handbook/zephyr-7b-sft-full) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-full) | alignment-handbook/zephyr-7b-sft-full | 92f9fac4529acacb2c33a35c46917393690c6311 | 57.556885 | apache-2.0 | 20 | 7 | true | true | true | true | 2024-01-16T02:43:13Z | false | 57.679181 | 80.820554 | 60.314194 | 41.708251 | 76.085241 | 28.733889 | false |
| alignment-handbook_zephyr-7b-sft-qlora_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | [alignment-handbook/zephyr-7b-sft-qlora](https://huggingface.co/alignment-handbook/zephyr-7b-sft-qlora) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_alignment-handbook__zephyr-7b-sft-qlora) | alignment-handbook/zephyr-7b-sft-qlora | 156bec577ff12a65236cfc90860dcc61e96c6fd6 | 59.003694 | apache-2.0 | 6 | 7 | true | true | true | true | 2024-01-26T19:49:52Z | false | 60.068259 | 82.364071 | 61.650513 | 38.875355 | 76.79558 | 34.268385 | false |
| allbyai_ToRoLaMa-7b-v1.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [allbyai/ToRoLaMa-7b-v1.0](https://huggingface.co/allbyai/ToRoLaMa-7b-v1.0) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0) | allbyai/ToRoLaMa-7b-v1.0 | 9dd9ebe69ae8b391722c4edbfe70bd6c59b3b14d | 47.868523 | llama2 | 8 | 7 | true | true | true | true | 2024-01-05T08:45:36Z | false | 51.706485 | 73.819956 | 45.338755 | 44.894455 | 70.086819 | 1.36467 | false |
| allenai_OLMo-1.7-7B-hf_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OlmoForCausalLM | [allenai/OLMo-1.7-7B-hf](https://huggingface.co/allenai/OLMo-1.7-7B-hf) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__OLMo-1.7-7B-hf) | allenai/OLMo-1.7-7B-hf | f5ad8164968910b1bf78635a9171b7ce410224c3 | 52.823063 | apache-2.0 | 8 | 6 | true | true | true | true | 2024-04-22T15:40:36Z | false | 49.40273 | 78.679546 | 53.517483 | 35.893859 | 72.454617 | 26.990144 | false |
| allenai_OLMo-1.7-7B-hf_float16 | float16 | 🟢 pretrained | 🟢 | Original | OlmoForCausalLM | [allenai/OLMo-1.7-7B-hf](https://huggingface.co/allenai/OLMo-1.7-7B-hf) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__OLMo-1.7-7B-hf) | allenai/OLMo-1.7-7B-hf | f5ad8164968910b1bf78635a9171b7ce410224c3 | 52.789003 | apache-2.0 | 8 | 6 | true | true | true | true | 2024-04-22T15:40:19Z | false | 49.40273 | 78.649671 | 53.385351 | 35.91487 | 72.770324 | 26.611069 | false |
| allenai_OLMo-1B-hf_float16 | float16 | 🟢 pretrained | 🟢 | Original | OLMoForCausalLM | [allenai/OLMo-1B-hf](https://huggingface.co/allenai/OLMo-1B-hf) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__OLMo-1B-hf) | allenai/OLMo-1B-hf | e005c9593b7109908ea0ba7eca5a79648259b7cb | 36.781937 | apache-2.0 | 10 | 1 | true | true | true | true | 2024-04-22T23:00:19Z | false | 34.726962 | 63.642701 | 26.234232 | 32.945313 | 61.24704 | 1.895375 | false |
| allenai_OLMo-1B-hf_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OLMoForCausalLM | [allenai/OLMo-1B-hf](https://huggingface.co/allenai/OLMo-1B-hf) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__OLMo-1B-hf) | allenai/OLMo-1B-hf | e005c9593b7109908ea0ba7eca5a79648259b7cb | 36.727782 | apache-2.0 | 10 | 1 | true | true | true | true | 2024-04-22T23:00:43Z | false | 34.556314 | 63.602868 | 26.306692 | 32.916254 | 61.089187 | 1.895375 | false |
| allenai_OLMo-7B-hf_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OLMoForCausalLM | [allenai/OLMo-7B-hf](https://huggingface.co/allenai/OLMo-7B-hf) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__OLMo-7B-hf) | allenai/OLMo-7B-hf | 60c4e0ab1860c15d2616509e68b36c7a541c584c | 43.363294 | apache-2.0 | 8 | 6 | true | true | true | true | 2024-04-21T19:35:45Z | false | 45.648464 | 77.305318 | 28.132272 | 35.926483 | 69.37648 | 3.790751 | false |
| allenai_digital-socrates-13b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [allenai/digital-socrates-13b](https://huggingface.co/allenai/digital-socrates-13b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__digital-socrates-13b) | allenai/digital-socrates-13b | c738ee4bb61e67eebb9d196c440dcb2d99e5f906 | 57.342382 | apache-2.0 | 9 | 13 | true | true | true | true | 2023-11-23T06:38:16Z | false | 58.361775 | 80.143398 | 57.005538 | 44.465904 | 74.585635 | 29.492039 | false |
| allenai_digital-socrates-7b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | [allenai/digital-socrates-7b](https://huggingface.co/allenai/digital-socrates-7b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__digital-socrates-7b) | allenai/digital-socrates-7b | 5d26db18b95778c31dc8425871052f495b267563 | 52.94925 | apache-2.0 | 6 | 7 | true | true | true | true | 2023-11-23T06:39:35Z | false | 54.43686 | 75.990838 | 51.41243 | 44.876999 | 73.08603 | 17.892343 | false |
| allenai_tulu-2-dpo-70b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | [allenai/tulu-2-dpo-70b](https://huggingface.co/allenai/tulu-2-dpo-70b) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allenai__tulu-2-dpo-70b) | allenai/tulu-2-dpo-70b | 0ab5c875f0070d5aee8d36bc55f41de440a13f02 | 73.765007 | other | 146 | 68 | true | true | true | true | 2024-02-01T13:38:05Z | false | 72.098976 | 88.986258 | 69.83749 | 65.776557 | 83.267561 | 62.623199 | false |
| allknowingroger_ANIMA-biodesign-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [allknowingroger/ANIMA-biodesign-7B-slerp](https://huggingface.co/allknowingroger/ANIMA-biodesign-7B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__ANIMA-biodesign-7B-slerp) | allknowingroger/ANIMA-biodesign-7B-slerp | a689d19578291d22aeb7fc3583837829f90b22b6 | 60.780608 | apache-2.0 | 1 | 7 | true | false | true | true | 2024-04-10T18:42:34Z | false | 60.068259 | 81.27863 | 58.536061 | 54.250742 | 75.295975 | 35.25398 | false |
| allknowingroger_AutoLimmy-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [allknowingroger/AutoLimmy-7B-slerp](https://huggingface.co/allknowingroger/AutoLimmy-7B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__AutoLimmy-7B-slerp) | allknowingroger/AutoLimmy-7B-slerp | cc0c775bd9ae1c1bcf174965c2133973c32a98d0 | 76.611575 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-04-10T18:17:14Z | false | 72.952218 | 89.125672 | 64.262721 | 78.123304 | 84.92502 | 70.280516 | false |
| allknowingroger_CalmExperiment-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [allknowingroger/CalmExperiment-7B-slerp](https://huggingface.co/allknowingroger/CalmExperiment-7B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__CalmExperiment-7B-slerp) | allknowingroger/CalmExperiment-7B-slerp | 1c1ea1f8fd123ba5cba970a1cfce97ed57542fd4 | 76.672894 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-04-10T18:20:36Z | false | 73.37884 | 89.085839 | 64.350198 | 77.931799 | 85.1618 | 70.128886 | false |
| allknowingroger_Calmesmol-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [allknowingroger/Calmesmol-7B-slerp](https://huggingface.co/allknowingroger/Calmesmol-7B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__Calmesmol-7B-slerp) | allknowingroger/Calmesmol-7B-slerp | 0c3cb00f5ec14b4f4f1d2a2a9bcf8db2c2732dd1 | 72.811725 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-04-10T18:21:49Z | false | 68.686007 | 86.84525 | 65.022744 | 62.82428 | 81.846882 | 71.645186 | false |
| allknowingroger_Calmex26merge-12B-MoE_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [allknowingroger/Calmex26merge-12B-MoE](https://huggingface.co/allknowingroger/Calmex26merge-12B-MoE) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__Calmex26merge-12B-MoE) | allknowingroger/Calmex26merge-12B-MoE | 16a9567ef1136531ff32c089b518a5019a810e5b | 76.603645 | apache-2.0 | 0 | 12 | true | false | false | true | 2024-04-24T08:01:55Z | false | 72.866894 | 89.13563 | 64.392178 | 77.945815 | 84.92502 | 70.356331 | false |
| allknowingroger_CeptrixBeagle-12B-MoE_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [allknowingroger/CeptrixBeagle-12B-MoE](https://huggingface.co/allknowingroger/CeptrixBeagle-12B-MoE) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__CeptrixBeagle-12B-MoE) | allknowingroger/CeptrixBeagle-12B-MoE | 70387f29ef0f76b555db635277c0117d7cf3eedb | 76.391417 | apache-2.0 | 1 | 12 | true | false | false | true | 2024-05-04T17:25:51Z | false | 72.78157 | 89.24517 | 64.179221 | 78.447083 | 85.082873 | 68.612585 | false |
| allknowingroger_DolphinChat-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [allknowingroger/DolphinChat-7B-slerp](https://huggingface.co/allknowingroger/DolphinChat-7B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__DolphinChat-7B-slerp) | allknowingroger/DolphinChat-7B-slerp | b9252efec1aefbe9c727aa851ea226643f7f4358 | 67.916892 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-04-10T18:28:52Z | false | 64.590444 | 84.206333 | 64.231981 | 50.85515 | 81.373323 | 62.244124 | false |
| allknowingroger_Experimentmultiverse-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [allknowingroger/Experimentmultiverse-7B-slerp](https://huggingface.co/allknowingroger/Experimentmultiverse-7B-slerp) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__Experimentmultiverse-7B-slerp) | allknowingroger/Experimentmultiverse-7B-slerp | b4662683ca8e1b07ff6a92478614b51f7e049a55 | 76.637689 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-04-10T18:50:43Z | false | 73.037543 | 89.105756 | 64.351619 | 78.201494 | 84.92502 | 70.204701 | false |
| allknowingroger_FrankenLimmy-10B-passthrough_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | [allknowingroger/FrankenLimmy-10B-passthrough](https://huggingface.co/allknowingroger/FrankenLimmy-10B-passthrough) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__FrankenLimmy-10B-passthrough) | allknowingroger/FrankenLimmy-10B-passthrough | 2f62333f11187a1b81f81bdd38688836bc559713 | 72.255989 | apache-2.0 | 0 | 10 | true | false | true | true | 2024-04-10T19:05:05Z | false | 71.672355 | 88.637722 | 63.909617 | 73.790355 | 83.820047 | 51.705838 | false |
| allknowingroger_FrankenLong-15B-passthrough_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | [allknowingroger/FrankenLong-15B-passthrough](https://huggingface.co/allknowingroger/FrankenLong-15B-passthrough) [📑](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__FrankenLong-15B-passthrough) | allknowingroger/FrankenLong-15B-passthrough | ce0caf1e27c13c86a26ca45851b7dc6fda992254 | 29.58503 |  | 0 | 17 | false | true | true | true | 2024-04-10T19:03:52Z | false | 29.095563 | 26.050588 | 24.555808 | 48.95266 | 48.855564 | 0 | false |
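
Every entry above reduces to the same schema: per-model metadata followed by the six benchmark scores and their average. As a minimal sketch of how such rows could be loaded and ranked, assuming a hypothetical CSV export named `leaderboard.csv` whose header uses the column names shown in the table (this is an illustration, not part of the leaderboard tooling):

```python
# Minimal sketch: rank leaderboard rows by their reported average score.
# Assumes a hypothetical "leaderboard.csv" export whose header row uses the
# column names from the table above (e.g. "fullname", "Average ⬆️", "GSM8K").
import csv

with open("leaderboard.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Scores are stored as strings in the CSV, so convert before sorting.
rows.sort(key=lambda r: float(r["Average ⬆️"]), reverse=True)

for r in rows[:5]:
    print(f'{r["fullname"]}: average {float(r["Average ⬆️"]):.2f}, GSM8K {float(r["GSM8K"]):.2f}')
```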