Dataset schema (one entry per column):

| Column | dtype | Lengths / values |
|---|---|---|
| eval_name | string | lengths 9–97 |
| Precision | string | 5 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 53 classes |
| Model | string | lengths 355–611 |
| fullname | string | lengths 4–89 |
| Model sha | string | lengths 0–40 |
| Average ⬆️ | float64 | 27–81.3 |
| Hub License | string | 35 classes |
| Hub ❤️ | int64 | 0–4.88k |
| #Params (B) | int64 | 0–238 |
| Available on the hub | bool | 2 classes |
| Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| date | string | lengths 0–26 |
| Chat Template | bool | 2 classes |
| ARC | float64 | 19.7–87.5 |
| HellaSwag | float64 | 20.7–92.8 |
| MMLU | float64 | 17.8–89.4 |
| TruthfulQA | float64 | 27.9–82.3 |
| Winogrande | float64 | 47.2–91.5 |
| GSM8K | float64 | 0–88.2 |
| Maintainers Choice | bool | 2 classes |
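Each leaderboard entry under this schema is a flat record keyed by the column names above. The sketch below (plain Python, with three sample records copied from rows in this dump; a real consumer would more likely load the full dataset with the `datasets` library or pandas) shows the basic filter-and-rank step, including the null handling that rows like amu/zen require:

```python
# Three records copied from rows in this dump (subset of columns for brevity).
rows = [
    {"fullname": "ammarali32/multi_verse_model", "Precision": "bfloat16",
     "Average ⬆️": 76.735609, "#Params (B)": 7},
    {"fullname": "amazingvince/Not-WizardLM-2-7B", "Precision": "bfloat16",
     "Average ⬆️": 63.660212, "#Params (B)": 7},
    {"fullname": "amu/zen", "Precision": "float16",
     "Average ⬆️": None, "#Params (B)": 7},  # null Average in the source
]

# Drop rows with a null Average before ranking, then sort best-first.
scored = [r for r in rows if r["Average ⬆️"] is not None]
ranked = sorted(scored, key=lambda r: r["Average ⬆️"], reverse=True)
print(ranked[0]["fullname"])  # prints ammarali32/multi_verse_model
```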
Rows. Each model links to its Hugging Face page (`https://huggingface.co/<fullname>`) and to its evaluation-details dataset (`open-llm-leaderboard/details_<owner>__<model>`, the 📑 link). Legend for the T column: 🟢 pretrained · 🟩 continuously pretrained · 🔶 fine-tuned on domain-specific datasets · 💬 chat models (RLHF, DPO, IFT, ...) · 🤝 base merges and moerges. An empty Hub License cell means the source row carried no license value; the final row is truncated in the source.

| fullname | Precision | T | Weight type | Architecture | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | On hub | Merged | MoE | Flagged | date | Chat Template | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Maintainers Choice |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| alykassem/ds_diasum_md_mixtral | 4bit | 💬 | Adapter | Unknown | 8ee85e4555b4c4a75b29ee749a86c97e0d37d242 | 68.42465 | openrail | 0 | 0 | true | true | true | true | 2023-12-25T16:50:51Z | false | 66.296928 | 85.451105 | 69.508812 | 55.721641 | 80.347277 | 53.222138 | false |
| amazingvince/Not-WizardLM-2-7B | bfloat16 | 🔶 | Original | MistralForCausalLM | 85e18c22b30a3f3ae4590c28d59eaca5dcc8c795 | 63.660212 | apache-2.0 | 88 | 7 | true | true | true | true | 2024-04-16T04:13:32Z | false | 62.883959 | 83.260307 | 61.533749 | 56.978409 | 73.55959 | 43.745262 | false |
| amazingvince/openhermes-7b-dpo | bfloat16 | 🔶 | Original | MistralForCausalLM | c191ac2d33de8bb5f1454e95c50fab40dc52974e | 65.13839 | apache-2.0 | 2 | 7 | true | true | true | true | 2024-01-27T21:48:39Z | false | 65.784983 | 84.943238 | 63.656788 | 57.013708 | 77.505919 | 41.925701 | false |
| amazingvince/where-llambo-7b | bfloat16 | 💬 | Original | MistralForCausalLM | 554d9c7bab7ea6deabef0266aef17aa98f758543 | 66.07654 | apache-2.0 | 0 | 7 | true | true | true | true | 2023-12-08T19:05:32Z | false | 58.447099 | 82.055367 | 62.6117 | 49.612201 | 78.531965 | 65.20091 | false |
| amazingvince/zephyr-smol_llama-100m-dpo-full | float16 | 💬 | Original | LlamaForCausalLM | be3400c89d66ed66f0aa96f1b8131604c118b67b | 29.369352 | apache-2.0 | 3 | 0 | true | true | true | true | 2023-11-20T22:34:40Z | false | 25 | 28.540131 | 25.180638 | 45.7475 | 51.065509 | 0.682335 | false |
| amazon/LightGPT | float16 | ? | Original | GPTJForCausalLM | 1f6ffd8f162030396a3bc1ca2e3504896dbe6434 | 39.540473 | | 0 | 0 | false | true | true | true | 2023-09-09T10:52:17Z | false | 39.931741 | 63.821948 | 28.447472 | 36.692081 | 64.483031 | 3.866566 | false |
| amazon/MistralLite | float16 | ? | Original | MistralForCausalLM | 23486089ab7ba741b34adc69ab7555885f8abe71 | 51.447262 | apache-2.0 | 426 | 0 | true | true | true | true | 2023-11-27T14:29:22Z | false | 59.556314 | 81.836288 | 50.932987 | 37.869582 | 77.426993 | 1.06141 | false |
| ammarali32/MultiVerse_LASER | float16 | 💬 | Original | MistralForCausalLM | 7385d3aa94cebfb10f983bc905fea3e83c4a4e3c | 76.332375 | | 0 | 7 | false | true | true | true | 2024-03-13T15:41:39Z | false | 72.525597 | 88.807011 | 64.516327 | 77.697931 | 84.92502 | 69.522365 | false |
| ammarali32/multi_verse_model | bfloat16 | 🟩 | Original | MistralForCausalLM | e2aa6fdad0b28a6019b0fc7c178a3579c3d671e8 | 76.735609 | | 0 | 7 | false | true | true | true | 2024-03-07T07:39:34Z | false | 72.866894 | 89.195379 | 64.400459 | 77.917644 | 84.767167 | 71.266111 | false |
| amu/dpo-Qwen1.5-0.5B-Chat | float16 | 💬 | Original | Qwen2ForCausalLM | dc2362b0dc1f77817039ff61c1a36e27a4c4c009 | 33.470303 | | 0 | 0 | true | true | true | true | 2024-05-06T13:27:03Z | false | 29.607509 | 42.710615 | 30.644507 | 41.226092 | 53.82794 | 2.805155 | false |
| amu/dpo-phi2 | float16 | 💬 | Original | PhiForCausalLM | 46d19a6f4e37644a426b0a6917959cf4bb388ef1 | 61.256762 | | 0 | 2 | false | true | true | true | 2024-02-09T16:15:34Z | false | 61.68942 | 75.134435 | 58.10178 | 43.988755 | 74.191002 | 54.435178 | false |
| amu/orpo-lora-phi2 | bfloat16 | 🔶 | Original | PhiForCausalLM | 646be9d724c5c041121426babe71c02b12d8ba31 | 60.616376 | | 0 | 2 | false | true | true | true | 2024-03-31T07:36:12Z | false | 60.324232 | 74.576778 | 58.119648 | 44.496169 | 73.717443 | 52.463988 | false |
| amu/orpo-phi2 | bfloat16 | 🔶 | Original | PhiForCausalLM | 97dae7d50dd98ed0eeb63afcd218a11b1e172ecb | 34.190592 | | 0 | 2 | false | true | true | true | 2024-04-01T15:55:59Z | false | 31.228669 | 41.515634 | 28.899565 | 47.61965 | 55.880032 | 0 | false |
| amu/r-zephyr-7b-beta-qlora | float16 | 🔶 | Original | MistralForCausalLM | 3948f437f08ebb9f0bc7da37cdead0cc3dd7a562 | 62.704264 | | 0 | 7 | false | true | true | true | 2024-02-18T12:56:17Z | false | 63.054608 | 85.381398 | 63.103018 | 46.320599 | 79.321231 | 39.044731 | false |
| amu/spin-phi2 | bfloat16 | 💬 | Original | PhiForCausalLM | b206227dcf0c36eb30edcee377e5b0ccdd3668c9 | 61.669797 | | 0 | 2 | false | true | true | true | 2024-03-03T05:56:29Z | false | 63.139932 | 75.562637 | 57.075917 | 45.765746 | 74.191002 | 54.283548 | false |
| amu/spin-phi2 | float16 | 🔶 | Original | PhiForCausalLM | 5040b8b4108f00030839472e5c97d7c5944904e7 | 61.67656 | | 0 | 2 | false | true | true | true | 2024-03-03T08:09:30Z | false | 63.566553 | 75.572595 | 57.926239 | 46.215358 | 73.480663 | 53.297953 | false |
| amu/spin-phi2-1.5 | bfloat16 | 🔶 | Original | PhiForCausalLM | 5c9c6b9819b1a1631ac4d6db1e93b011a318756c | 61.105156 | | 0 | 2 | false | true | true | true | 2024-03-03T08:46:01Z | false | 63.651877 | 75.791675 | 56.524544 | 46.398566 | 73.164957 | 51.099318 | false |
| amu/spin-phi2-2 | bfloat16 | 🔶 | Original | PhiForCausalLM | 5c9c6b9819b1a1631ac4d6db1e93b011a318756c | 61.105156 | | 0 | 2 | false | true | true | true | 2024-03-03T08:46:32Z | false | 63.651877 | 75.791675 | 56.524544 | 46.398566 | 73.164957 | 51.099318 | false |
| amu/zen | float16 | 🔶 | Delta | MistralForCausalLM | 2d41f336037eadddf1dcd75d622813ab8e956067 | null | | 0 | 7 | false | true | true | true | 2024-01-07T11:21:59Z | false | 23.976109 | 25.084644 | 23.256976 | null | 49.64483 | 0 | false |
| amu/zen_moe | bfloat16 | 🔶 | Original | MixtralForCausalLM | 5e6e23c4da1c3b6049a42d755cdf74848efd454a | 68.344431 | | 0 | 12 | false | true | false | true | 2024-01-15T16:24:54Z | false | 63.822526 | 85.052778 | 64.753613 | 50.027512 | 81.057616 | 65.35254 | false |
| anakin87/gemma-2b-orpo | bfloat16 | 💬 | Original | GemmaForCausalLM | 76e5b9ca4e8a328b550b7099f77b2fc2732d71a6 | 47.353991 | other | 27 | 2 | true | true | true | true | 2024-05-06T13:47:05Z | false | 49.146758 | 73.720374 | 38.522806 | 44.534683 | 64.325178 | 13.874147 | false |
| anakin87/gemma-3.5b-orpo-selfmerge | bfloat16 | 🤝 | Original | GemmaForCausalLM | ef6b7e692aad4111e794e1678ddb4eaa9b3cb02b | 41.861616 | | 0 | 3 | false | true | true | true | 2024-05-07T09:43:50Z | false | 43.686007 | 64.528978 | 34.473493 | 44.961702 | 59.273875 | 4.245641 | false |
| anas-awadalla/mpt-1b-redpajama-200b | float16 | 🔶 | Original | MosaicGPT | fc98636655efb7c091bbe5d8014eb138ddfc5471 | 29.045008 | apache-2.0 | 2 | 1 | true | true | true | true | 2023-11-06T10:31:15Z | false | 25.767918 | 26.080462 | 24.497332 | 47.569169 | 50.35517 | 0 | false |
| anas-awadalla/mpt-7b | float16 | 🔶 | Original | MPTForCausalLM | b772e556c8e8a17d087db6935e7cd019e5eefb0f | 44.276598 | | 1 | 7 | false | true | true | true | 2023-10-16T12:46:18Z | false | 47.696246 | 77.574188 | 30.795325 | 33.436722 | 72.138911 | 4.018196 | false |
| andreaskoepf/llama2-13b-megacode2_min100 | float16 | 🔶 | Original | LlamaForCausalLM | b38d1b53c358a0313c69bcceebe97628327ada82 | 56.920609 | other | 1 | 13 | false | true | true | true | 2023-10-16T12:48:18Z | false | 60.580205 | 81.258713 | 57.916388 | 48.893759 | 76.953433 | 15.921152 | false |
| andrijdavid/Macaroni-7b-Tied | float16 | 🔶 | Original | MistralForCausalLM | 6323cf53ed75eab25ca37b3636a0f38ee8d1ac30 | 74.96255 | apache-2.0 | 1 | 7 | true | false | true | true | 2024-01-20T11:44:11Z | false | 72.866894 | 88.139813 | 64.732858 | 70.540553 | 81.925809 | 71.569371 | false |
| andrijdavid/Macaroni-v2-7b | float16 | 🔶 | Original | MistralForCausalLM | b611850983ecc381c68b4853b1e2aa570ce22330 | 62.052962 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-02-07T13:23:38Z | false | 67.150171 | 83.837881 | 61.28524 | 67.067213 | 79.558011 | 13.419257 | false |
| andrijdavid/Meta-Llama-3-13B-Instruct | float16 | 🔶 | Original | LlamaForCausalLM | 474e750f496928b85ffe03a21fe1eb4c0c1540af | 52.27113 | other | 0 | 13 | true | false | true | true | 2024-05-07T21:42:37Z | false | 53.156997 | 69.069906 | 59.082532 | 50.070982 | 68.90292 | 13.343442 | false |
| andrijdavid/macaroni-7b | float16 | 🔶 | Original | MistralForCausalLM | e1c0fe26554eb627aed9569f106e838f0333850f | 74.603893 | apache-2.0 | 1 | 7 | true | false | true | true | 2024-01-19T18:16:25Z | false | 73.122867 | 88.169687 | 64.582492 | 68.763195 | 84.372534 | 68.612585 | false |
| andrijdavid/tinyllama-dare | float16 | 🔶 | Original | LlamaForCausalLM | f3c5e1369064d3167377b6965a74637d26102e6b | 38.64078 | apache-2.0 | 1 | 1 | true | false | true | true | 2024-01-19T18:32:32Z | false | 37.286689 | 62.776339 | 25.198735 | 39.011276 | 65.90371 | 1.66793 | false |
| andysalerno/cloudymixtral7Bx2-nectar-0.2 | float16 | 🔶 | Original | MixtralForCausalLM | 56b640240f1aca4a91ccf66de041c86102dfe2c9 | 59.541964 | | 0 | 12 | false | true | true | true | 2024-01-21T23:57:44Z | false | 67.491468 | 80.830512 | 65.142522 | 68.698944 | 73.875296 | 1.21304 | false |
| andysalerno/cloudymixtral7Bx2-nectar-0.2 | bfloat16 | 🔶 | Original | MixtralForCausalLM | 56b640240f1aca4a91ccf66de041c86102dfe2c9 | 59.529751 | | 0 | 12 | false | true | true | true | 2024-01-21T23:59:40Z | false | 67.491468 | 80.770763 | 65.091902 | 68.732926 | 73.954223 | 1.137225 | false |
| andysalerno/mistral-sft-v3 | bfloat16 | 🟢 | Original | MistralForCausalLM | 48beb1e9490732abc6f85d92579d407d85e2cf5d | 60.930438 | apache-2.0 | 3 | 7 | true | true | true | true | 2024-01-31T03:45:42Z | false | 61.348123 | 82.234615 | 63.400322 | 48.486974 | 77.663773 | 32.448825 | false |
| andysalerno/openchat-nectar-0.1 | float16 | 🔶 | Original | MistralForCausalLM | cf913c9f807a9bdbe606ac4bf445d93a082a118c | 69.940153 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-01-11T08:30:26Z | false | 66.211604 | 82.991436 | 65.174313 | 54.216246 | 81.373323 | 69.673995 | false |
| andysalerno/openchat-nectar-0.11 | float16 | 🔶 | Original | MistralForCausalLM | 311304dd45050345aea499c85ddd3af89411513d | 69.594942 | | 0 | 7 | true | true | true | true | 2024-01-21T16:11:40Z | false | 66.211604 | 83.280223 | 65.247875 | 52.916742 | 81.452249 | 68.460955 | false |
| andysalerno/openchat-nectar-0.14 | float16 | 🔶 | Original | MistralForCausalLM | 6a3412e4ece04c794bef9d90e38a6dcb6ad07f70 | 69.089076 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-01-23T21:43:03Z | false | 65.614334 | 83.02131 | 64.582923 | 50.092047 | 82.004736 | 69.219105 | false |
| andysalerno/openchat-nectar-0.3 | float16 | 🔶 | Original | MistralForCausalLM | cc29b95f9d0bee765206b07e4d9bba05a0fcafb2 | 69.364381 | | 0 | 7 | true | true | true | true | 2024-01-12T19:46:19Z | false | 65.955631 | 83.150767 | 65.462849 | 52.383058 | 81.531176 | 67.702805 | false |
| andysalerno/openchat-nectar-0.4 | float16 | 🔶 | Original | MistralForCausalLM | 25eaf0bb01b56d1ce515dd1aa972be468e04c3ed | 69.515892 | | 0 | 7 | true | true | true | true | 2024-01-12T23:14:11Z | false | 66.638225 | 83.230432 | 65.218523 | 51.706555 | 81.689029 | 68.612585 | false |
| andysalerno/openchat-nectar-0.5 | float16 | 🔶 | Original | MistralForCausalLM | ba3caf530cbd9caf5c7cc7639cc47a910ed2a120 | 69.667361 | apache-2.0 | 2 | 7 | true | true | true | true | 2024-01-14T13:22:27Z | false | 66.723549 | 83.529177 | 65.357449 | 52.152633 | 82.083662 | 68.157695 | false |
| andysalerno/openchat-nectar-0.6 | float16 | 🔶 | Original | MistralForCausalLM | 502b55ebd1ca3c159591a9d7e9d9a456ac067e8d | 69.639444 | | 0 | 7 | true | true | true | true | 2024-01-16T21:17:09Z | false | 66.552901 | 83.220474 | 65.193672 | 51.904338 | 81.21547 | 69.74981 | false |
| andysalerno/openchat-nectar-0.7 | float16 | 🔶 | Original | MistralForCausalLM | 082de25a339e1e8e5a64c9fc84429f1a4a0847ac | 69.192778 | | 0 | 7 | true | true | true | true | 2024-01-21T03:06:41Z | false | 65.784983 | 83.001394 | 65.097329 | 52.045203 | 81.373323 | 67.854435 | false |
| andysalerno/openchat-nectar-0.8 | float16 | 🔶 | Original | MistralForCausalLM | 639db94ac706d6964a3eed642b8de3a582bbffa8 | 69.26216 | | 0 | 7 | true | true | true | true | 2024-01-21T06:06:38Z | false | 65.784983 | 83.051185 | 65.16158 | 52.262305 | 81.610103 | 67.702805 | false |
| andysalerno/rainbowfish-7B-v10 | bfloat16 | 💬 | Original | MistralForCausalLM | 22a4cd7ecfdafb957ba2233b9c06fccd70663cfa | 61.880638 | | 0 | 7 | false | true | true | true | 2024-02-13T16:41:43Z | false | 61.177474 | 82.334196 | 63.262664 | 49.453359 | 78.058406 | 36.997726 | false |
| andysalerno/rainbowfish-7B-v9 | bfloat16 | 💬 | Original | MistralForCausalLM | c1b344f0efaacd2309d22dcbe4358a00bdd50f15 | 61.415877 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-02-12T19:08:51Z | false | 61.774744 | 82.433778 | 63.002979 | 48.820897 | 77.663773 | 34.79909 | false |
| andysalerno/rainbowfish-v6 | bfloat16 | 💬 | Original | MistralForCausalLM | 2b62fc1c6f1105c21ec96f958f0d16d2197517cc | 61.640294 | | 0 | 7 | false | true | true | true | 2024-02-07T05:08:25Z | false | 61.945392 | 82.513444 | 62.792091 | 48.374896 | 77.900552 | 36.31539 | false |
| andysalerno/rainbowfish-v7 | bfloat16 | 💬 | Original | MistralForCausalLM | | | | | | | | | | | | | | | | | | |
<a target="_blank" href="https://huggingface.co/andysalerno/rainbowfish-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">andysalerno/rainbowfish-v7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-v7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
andysalerno/rainbowfish-v7
896039c526d6d5977fb7943743666b4dc2563b3e
62.181885
apache-2.0
1
7
true
true
true
true
2024-02-08T05:03:01Z
false
61.945392
82.523402
63.256318
49.776248
78.137332
37.452616
false
anhnv125_llama-op-v4_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/anhnv125/llama-op-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anhnv125/llama-op-v4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_anhnv125__llama-op-v4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
anhnv125/llama-op-v4
6cd644049de2b944beaefcc6aa34965c00e08529
54.336135
0
12
false
true
true
true
2023-10-16T12:48:18Z
false
61.518771
79.207329
57.010962
42.723856
75.927388
9.628506
false
anhnv125_pygmalion-6b-roleplay_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/anhnv125/pygmalion-6b-roleplay" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anhnv125/pygmalion-6b-roleplay</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_anhnv125__pygmalion-6b-roleplay" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
anhnv125/pygmalion-6b-roleplay
e49ed0bde45de0a436bff678ec4872069e8f230c
38.344319
2
6
false
true
true
true
2023-09-09T10:52:17Z
false
40.52901
67.46664
25.730836
32.534485
62.667719
1.137225
false
ankhamun_xxxI-Ixxx_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ankhamun/xxxI-Ixxx" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ankhamun/xxxI-Ixxx</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ankhamun__xxxI-Ixxx" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ankhamun/xxxI-Ixxx
4063a7f7f22b9f6f22cfaf518e85743bdce4dc11
54.562198
apache-2.0
0
7
true
true
true
true
2024-02-08T01:58:44Z
false
54.180887
72.54531
52.022854
54.42192
70.244672
23.957544
false
antiven0m_brugle-rp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/antiven0m/brugle-rp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">antiven0m/brugle-rp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_antiven0m__brugle-rp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
antiven0m/brugle-rp
02096a89cfa76a0bb9aa331a771edd703674b0c3
null
0
7
false
true
true
true
2024-01-22T00:49:15Z
false
22.696246
25.044812
23.116858
null
49.565904
0
false
antiven0m_finch_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/antiven0m/finch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">antiven0m/finch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_antiven0m__finch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
antiven0m/finch
8dbf40c7be17ddb1b2a07e49c60c180fed741172
73.782824
cc-by-nc-4.0
1
7
true
false
true
true
2024-02-06T14:13:59Z
false
71.587031
87.870942
64.807316
67.957765
84.135754
66.338135
false
antiven0m_reverie-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/antiven0m/reverie-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">antiven0m/reverie-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_antiven0m__reverie-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
antiven0m/reverie-7b
f8d9b2b4cac7f9612b91bd19feb1a50750ddf070
71.386292
cc-by-nc-4.0
0
7
true
false
true
true
2024-04-08T07:07:44Z
false
68.34471
87.163912
65.027694
59.738714
82.083662
65.95906
false
anton-l_gpt-j-tiny-random_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/anton-l/gpt-j-tiny-random" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anton-l/gpt-j-tiny-random</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_anton-l__gpt-j-tiny-random" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
anton-l/gpt-j-tiny-random
feea91564dac0081f73aeb6744979c6cfe553fff
28.918986
1
0
false
true
true
true
2023-10-16T12:58:30Z
false
26.365188
25.7618
24.462019
47.437931
49.486977
0
false
appvoid_palmer-002_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/appvoid/palmer-002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">appvoid/palmer-002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_appvoid__palmer-002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
appvoid/palmer-002
8b79b8c2126483baeb3a503c51cd4ffa9d7c11a6
36.793606
apache-2.0
2
0
true
true
true
true
2024-01-06T06:32:09Z
false
34.47099
59.410476
25.93517
37.06424
62.667719
1.21304
false
appvoid_palmer-002.5_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/appvoid/palmer-002.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">appvoid/palmer-002.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_appvoid__palmer-002.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
appvoid/palmer-002.5
5a987c226e4935167dbbec5565d16c66853a3932
38.860977
apache-2.0
5
1
true
false
true
true
2024-01-25T15:02:22Z
false
37.542662
61.840271
25.211519
40.222951
66.377269
1.97119
false
aqweteddy_mistral_tv-neural-marconroni_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/aqweteddy/mistral_tv-neural-marconroni" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aqweteddy/mistral_tv-neural-marconroni</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_aqweteddy__mistral_tv-neural-marconroni" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
aqweteddy/mistral_tv-neural-marconroni
969f7483d768b15998cd57b392ea1a9718de3b28
71.273649
mit
0
7
true
true
true
true
2023-12-29T16:21:31Z
false
69.197952
86.257718
65.070318
60.029639
80.899763
66.186505
false
arcee-ai_Clown-DPO-Extended_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/arcee-ai/Clown-DPO-Extended" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Clown-DPO-Extended</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arcee-ai__Clown-DPO-Extended" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arcee-ai/Clown-DPO-Extended
a6c74a9d47c1c311d8387877f85c4ae0f70eacca
76.405841
apache-2.0
5
8
true
false
true
true
2024-03-18T19:03:09Z
false
73.122867
89.085839
64.521454
78.783135
84.68824
68.23351
false
arcee-ai_Saul-Instruct-Clown-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/arcee-ai/Saul-Instruct-Clown-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Saul-Instruct-Clown-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arcee-ai__Saul-Instruct-Clown-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arcee-ai/Saul-Instruct-Clown-7b
d7954892af5c69c741493618e3830992929196a1
71.898598
apache-2.0
0
7
true
false
true
true
2024-03-18T18:30:03Z
false
68.088737
86.227843
64.405876
63.204593
81.610103
67.854435
false
argilla_CapybaraHermes-2.5-Mistral-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/CapybaraHermes-2.5-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/CapybaraHermes-2.5-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__CapybaraHermes-2.5-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/CapybaraHermes-2.5-Mistral-7B
488b5d3a878dcbadf3f316dca9332f484ffd4e0d
68.143179
apache-2.0
60
7
true
true
true
true
2024-02-05T03:35:04Z
false
65.784983
85.451105
63.12851
56.911948
78.295185
59.287339
false
argilla_DistilabelBeagle14-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/DistilabelBeagle14-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/DistilabelBeagle14-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__DistilabelBeagle14-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/DistilabelBeagle14-7B
a7c3dec7418b86dc4b6169d349d0f11199a222ab
67.515134
cc-by-nc-4.0
4
7
true
false
true
true
2024-01-23T15:51:38Z
false
71.075085
87.004581
61.273293
68.907988
80.74191
36.087945
false
argilla_distilabeled-Hermes-2.5-Mistral-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/distilabeled-Hermes-2.5-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/distilabeled-Hermes-2.5-Mistral-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__distilabeled-Hermes-2.5-Mistral-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/distilabeled-Hermes-2.5-Mistral-7B
71e12bedd29a0d8e8744f32a41aca68769fc99c2
68.417775
0
7
false
true
true
true
2024-01-10T18:17:02Z
false
66.296928
85.15236
63.498273
55.753037
78.926598
60.879454
false
argilla_distilabeled-Marcoro14-7B-slerp_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/distilabeled-Marcoro14-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/distilabeled-Marcoro14-7B-slerp
baa14c82695e595b5d39f35068898feb6fdceb34
73.633634
cc-by-nc-4.0
10
7
true
false
true
true
2024-01-11T16:05:39Z
false
70.733788
87.472615
65.218791
65.10265
82.083662
71.190296
false
argilla_distilabeled-Marcoro14-7B-slerp-full_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp-full" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/distilabeled-Marcoro14-7B-slerp-full</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__distilabeled-Marcoro14-7B-slerp-full" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/distilabeled-Marcoro14-7B-slerp-full
8a4b63ce6161a85d53a5ac9504a758e95ac052dd
73.400769
apache-2.0
1
7
true
false
true
true
2024-01-14T16:42:42Z
false
70.648464
87.55228
65.325369
64.214175
82.004736
70.659591
false
argilla_notus-7b-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/notus-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notus-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notus-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/notus-7b-v1
89f594b32aea9bf5de0abe3877f20ff302549934
60.223046
mit
118
7
true
true
true
true
2023-11-27T10:20:12Z
false
64.590444
84.783908
63.033082
54.367684
79.400158
15.163002
false
argilla_notus-7b-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/notus-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notus-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notus-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/notus-7b-v1
f23f4cf6cb76402c76e932ead01109191af72a60
63.48999
mit
118
7
true
true
true
true
2023-12-01T22:48:34Z
false
64.590444
84.833698
63.036208
54.349932
79.558011
34.571645
false
argilla_notus-8x7b-experiment_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
<a target="_blank" href="https://huggingface.co/argilla/notus-8x7b-experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notus-8x7b-experiment</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notus-8x7b-experiment" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/notus-8x7b-experiment
86c89d182babd29521a41a54528e5bf8331ed4cd
73.182407
0
46
false
true
true
true
2023-12-18T10:39:45Z
false
70.989761
87.731528
71.334276
65.791173
81.610103
61.637604
false
argilla_notux-8x7b-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/argilla/notux-8x7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notux-8x7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notux-8x7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/notux-8x7b-v1
1f8562051647d5537dc950315e74534b363a0812
72.969386
apache-2.0
163
46
true
true
false
true
2023-12-28T11:27:25Z
false
70.648464
87.721569
71.388942
66.208533
80.74191
61.106899
false
argilla_notux-8x7b-v1-epoch-2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
<a target="_blank" href="https://huggingface.co/argilla/notux-8x7b-v1-epoch-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notux-8x7b-v1-epoch-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_argilla__notux-8x7b-v1-epoch-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
argilla/notux-8x7b-v1-epoch-2
bd3924498c3ae041334be5018cd912b6537a633c
73.046057
0
0
false
true
true
true
2024-01-05T21:53:54Z
false
70.648464
87.801235
71.42649
65.967741
82.083662
60.348749
false
aridoverrun_Foxglove_7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/aridoverrun/Foxglove_7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aridoverrun/Foxglove_7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_aridoverrun__Foxglove_7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
aridoverrun/Foxglove_7B
170b019aefdd0fd47c616d0826550068fd743062
68.772117
0
7
false
true
true
true
2024-04-07T12:47:22Z
false
67.832765
86.566421
62.890803
69.642503
80.74191
44.958302
false
ariellee_SuperPlatty-30B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/ariellee/SuperPlatty-30B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ariellee/SuperPlatty-30B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ariellee__SuperPlatty-30B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ariellee/SuperPlatty-30B
017e1c32bca060107337dbf26db2044a7caa56f2
59.298745
0
32
false
true
true
true
2023-10-16T12:48:18Z
false
65.784983
83.947421
62.568701
53.515579
80.347277
9.628506
false
arlineka_Brunhilde-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arlineka/Brunhilde-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arlineka/Brunhilde-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arlineka/Brunhilde-13b
ce50fccfb850fc07618c6d215823b754b42346ed
56.202967
cc-by-nc-4.0
0
13
true
false
true
true
2024-02-14T10:32:54Z
false
60.494881
83.489345
56.177154
52.350055
75.532755
9.173616
false
arlineka_Brunhilde-13b-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arlineka/Brunhilde-13b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arlineka/Brunhilde-13b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arlineka/Brunhilde-13b-v1
e13977c7951d5d8cd77d301f75a7a3822c4800ee
57.880586
cc-by-nc-4.0
0
13
true
false
true
true
2024-02-14T14:39:26Z
false
61.09215
83.578968
55.321945
51.982428
75.217048
20.090978
false
arlineka_Brunhilde-13b-v3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arlineka/Brunhilde-13b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arlineka/Brunhilde-13b-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-13b-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arlineka/Brunhilde-13b-v3
7b54e7e30e7156586d908857910e4fa502c2fcf5
54.64964
cc-by-nc-4.0
0
13
true
false
false
true
2024-04-01T22:00:23Z
false
60.153584
84.017128
55.031631
52.985085
74.269929
1.440485
false
arlineka_Brunhilde-2x7b-MOE-DPO-v.01.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__Brunhilde-2x7b-MOE-DPO-v.01.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arlineka/Brunhilde-2x7b-MOE-DPO-v.01.5
d9bb402315f47764bf0f6002e513cd7e89c7c804
71.811325
cc-by-nc-4.0
0
12
true
true
false
true
2024-02-13T15:05:22Z
false
69.539249
87.024497
64.929484
65.472681
80.899763
63.002274
false
arlineka_CatNyanster-34b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arlineka/CatNyanster-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arlineka/CatNyanster-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__CatNyanster-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arlineka/CatNyanster-34b
96309889e5b06d9e6f183b85a727a0f5631583e0
74.027732
0
34
false
true
true
true
2024-04-04T02:31:05Z
false
68.259386
85.690102
78.178384
56.630433
84.293607
71.114481
false
arlineka_KittyNyanster-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/arlineka/KittyNyanster-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arlineka/KittyNyanster-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arlineka__KittyNyanster-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arlineka/KittyNyanster-v1
22bc4acbba8c623d21dd254036a909deb98c7602
66.024119
cc-by-nc-4.0
4
7
false
true
true
true
2024-04-03T02:59:53Z
false
65.102389
84.724159
64.386056
57.86644
77.742699
46.322972
false
arshadshk_Mistral-Hinglish-7B-Instruct-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/arshadshk/Mistral-Hinglish-7B-Instruct-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arshadshk/Mistral-Hinglish-7B-Instruct-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arshadshk__Mistral-Hinglish-7B-Instruct-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arshadshk/Mistral-Hinglish-7B-Instruct-v0.2
987a8027701ba1bda62ae86a57051b8b18ce7ef3
44.087559
apache-2.0
4
7
true
true
true
true
2024-03-14T07:16:58Z
false
40.358362
71.977694
23.116858
49.961359
66.298343
12.812737
false
arvindanand_Deepseek-Wizard-33B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arvindanand/Deepseek-Wizard-33B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arvindanand/Deepseek-Wizard-33B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arvindanand__Deepseek-Wizard-33B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arvindanand/Deepseek-Wizard-33B-slerp
07c9bc1f8f367238e8e69578e509c115cd9dd917
34.28637
apache-2.0
0
17
true
false
true
true
2024-04-10T03:41:25Z
false
31.399317
36.934874
37.798326
44.810642
54.775059
0
false
arvindanand_ValidateAI-2-33B-AT_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arvindanand/ValidateAI-2-33B-AT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arvindanand/ValidateAI-2-33B-AT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arvindanand__ValidateAI-2-33B-AT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arvindanand/ValidateAI-2-33B-AT
3db1dbd40e44f811e8cb6c8c93df65c6a6d58cb2
49.74472
apache-2.0
0
33
true
false
true
true
2024-04-11T07:57:05Z
false
45.989761
62.885879
43.517008
44.442148
62.588792
39.044731
false
arvindanand_ValidateAI-3-33B-Ties_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arvindanand/ValidateAI-3-33B-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arvindanand/ValidateAI-3-33B-Ties</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arvindanand__ValidateAI-3-33B-Ties" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arvindanand/ValidateAI-3-33B-Ties
f556cfcf87f5ab1a88a296a9fa155be369c2f291
49.478706
apache-2.0
0
33
true
false
true
true
2024-04-11T19:28:09Z
false
45.989761
61.710815
44.043442
42.262985
63.062352
39.802881
false
arvindanand_ValidateAI-33B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/arvindanand/ValidateAI-33B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arvindanand/ValidateAI-33B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_arvindanand__ValidateAI-33B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
arvindanand/ValidateAI-33B-slerp
6a470f4b9f1370f40a5dd081c0cd0f28e27e9c22
34.768626
apache-2.0
0
17
true
false
true
true
2024-04-10T03:41:06Z
false
31.143345
36.825334
40.049488
45.660677
54.932912
0
false
ashercn97_giraffe-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ashercn97/giraffe-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ashercn97/giraffe-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ashercn97__giraffe-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ashercn97/giraffe-7b
9af88449bed5be4709befcfbbba123ee75805479
45.286665
null
0
7
false
true
true
true
2023-10-16T12:48:18Z
false
47.1843
75.532762
38.889179
38.478373
68.981847
2.653525
false
ashercn97_manatee-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ashercn97/manatee-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ashercn97/manatee-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ashercn97__manatee-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ashercn97/manatee-7b
e66094c43ffe6c5b3f4164cd4ba048d3bc422fd0
51.843961
null
2
7
false
true
true
true
2023-10-16T12:46:18Z
false
54.522184
78.948417
49.26361
46.772049
74.506709
7.050796
false
athirdpath_Iambe-20b-DARE-v2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/athirdpath/Iambe-20b-DARE-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">athirdpath/Iambe-20b-DARE-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Iambe-20b-DARE-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
athirdpath/Iambe-20b-DARE-v2
02bd8edd30a5ddd1eede94c19a6ae160842a2f9f
61.992099
cc-by-nc-4.0
6
19
true
true
true
true
2023-12-05T07:21:13Z
false
62.798635
84.534953
60.450219
53.853639
77.03236
33.28279
false
athirdpath_NSFW_DPO_Noromaid-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/athirdpath/NSFW_DPO_Noromaid-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">athirdpath/NSFW_DPO_Noromaid-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__NSFW_DPO_Noromaid-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
athirdpath/NSFW_DPO_Noromaid-7b
51b4408a40736e18f69d932cb403811558428378
61.588301
cc-by-nc-4.0
1
7
true
true
true
true
2023-12-13T02:37:21Z
false
62.627986
84.49512
63.336448
44.993493
78.216259
35.8605
false
athirdpath_Orca-2-13b-Alpaca-Uncensored_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">athirdpath/Orca-2-13b-Alpaca-Uncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
athirdpath/Orca-2-13b-Alpaca-Uncensored
2fdbef532345da9eba9b9f4b8aaef6ea11b664fe
61.632152
other
2
13
true
true
true
true
2024-02-17T12:59:44Z
false
61.09215
79.267078
60.130236
53.589875
77.426993
38.286581
false
augmxnt_shisa-7b-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/augmxnt/shisa-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augmxnt/shisa-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augmxnt/shisa-7b-v1
131c2f3bf4955d1e2b6762380132bdd8688c0646
55.010334
apache-2.0
28
7
true
true
true
true
2023-12-07T15:00:07Z
false
56.143345
78.629755
23.116858
52.491202
78.058406
41.622441
false
augmxnt_shisa-base-7b-v1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/augmxnt/shisa-base-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augmxnt/shisa-base-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-base-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augmxnt/shisa-base-7b-v1
5aa465caca707816a4bb36b4980aef5d102d76fb
51.640609
apache-2.0
15
7
true
true
true
true
2023-12-07T15:00:40Z
false
52.303754
77.633937
23.116858
42.396642
78.531965
35.8605
false
augmxnt_shisa-gamma-7b-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/augmxnt/shisa-gamma-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augmxnt/shisa-gamma-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augmxnt__shisa-gamma-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augmxnt/shisa-gamma-7b-v1
49bf4a58453d191845668b8ff17e4b8f0e9ccae6
55.504757
apache-2.0
13
7
true
true
true
true
2024-01-04T19:41:03Z
false
53.156997
77.295359
55.228996
50.72739
73.875296
22.744503
false
augtoma_qCammel-13_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/augtoma/qCammel-13" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augtoma/qCammel-13</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-13" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augtoma/qCammel-13
af473e64f6a4fa02a7e24ee7679eea9505eb179d
56.049668
other
11
0
true
true
true
true
2023-09-09T10:52:17Z
false
60.836177
83.658634
56.727354
47.539423
76.164167
11.372252
false
augtoma_qCammel-70_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/augtoma/qCammel-70" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augtoma/qCammel-70</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augtoma/qCammel-70
cf1e917e42fd1e56ee1edef7ee1a98cbe705c18c
66.312734
0
68
false
true
true
true
2023-09-09T10:52:17Z
false
68.34471
87.870942
70.178923
57.468736
84.293607
29.719484
false
augtoma_qCammel-70-x_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/augtoma/qCammel-70-x" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augtoma/qCammel-70-x</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70-x" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augtoma/qCammel-70-x
cf1e917e42fd1e56ee1edef7ee1a98cbe705c18c
66.312734
other
27
0
true
true
true
true
2023-09-09T10:52:17Z
false
68.34471
87.870942
70.178923
57.468736
84.293607
29.719484
false
augtoma_qCammel-70v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/augtoma/qCammel-70v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augtoma/qCammel-70v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augtoma/qCammel-70v1
cf1e917e42fd1e56ee1edef7ee1a98cbe705c18c
66.312734
0
68
false
true
true
true
2023-09-09T10:52:17Z
false
68.34471
87.870942
70.178923
57.468736
84.293607
29.719484
false
augtoma_qCammel-70x_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/augtoma/qCammel-70x" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augtoma/qCammel-70x</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel-70x" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augtoma/qCammel-70x
cf1e917e42fd1e56ee1edef7ee1a98cbe705c18c
66.312734
0
68
false
true
true
true
2023-09-09T10:52:17Z
false
68.34471
87.870942
70.178923
57.468736
84.293607
29.719484
false
augtoma_qCammel70_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
<a target="_blank" href="https://huggingface.co/augtoma/qCammel70" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">augtoma/qCammel70</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_augtoma__qCammel70" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
augtoma/qCammel70
cf1e917e42fd1e56ee1edef7ee1a98cbe705c18c
66.312734
0
68
false
true
true
true
2023-09-09T10:52:17Z
false
68.34471
87.870942
70.178923
57.468736
84.293607
29.719484
false
ausboss_llama-13b-supercot_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ausboss/llama-13b-supercot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ausboss/llama-13b-supercot</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama-13b-supercot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ausboss/llama-13b-supercot
f6953fa162b487a3d4c6bdc7b7951e09576c2ae5
52.440652
other
8
13
true
true
true
true
2023-09-09T10:52:17Z
false
56.05802
81.706831
45.360534
48.546568
75.769534
7.202426
false
ausboss_llama-30b-supercot_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ausboss/llama-30b-supercot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ausboss/llama-30b-supercot</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama-30b-supercot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ausboss/llama-30b-supercot
dc9d81f454d286ea040c5cd45b058aecaa51c13e
58.730382
126
30
false
true
true
true
2023-10-16T12:46:18Z
false
64.846416
85.082653
56.560105
53.958589
80.031571
11.902957
false
ausboss_llama7b-wizardlm-unfiltered_float16
float16
?
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ausboss/llama7b-wizardlm-unfiltered</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ausboss/llama7b-wizardlm-unfiltered
2123beec77083c414b2ae51dd25b7a870b0b936c
46.943734
6
7
false
true
true
true
2023-09-09T10:52:17Z
false
52.986348
77.89285
36.411613
37.753376
72.296764
4.321456
false
automerger_Experiment26Yamshadow-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/automerger/Experiment26Yamshadow-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">automerger/Experiment26Yamshadow-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__Experiment26Yamshadow-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
automerger/Experiment26Yamshadow-7B
a0dfb3964cc2ba8dc398f9a7d1386bead4299697
76.656162
apache-2.0
0
7
true
false
true
true
2024-04-05T22:40:08Z
false
72.78157
89.046007
64.555832
78.12058
84.92502
70.507961
false
automerger_Experiment27Neuralsirkrishna-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/automerger/Experiment27Neuralsirkrishna-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">automerger/Experiment27Neuralsirkrishna-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_automerger__Experiment27Neuralsirkrishna-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
automerger/Experiment27Neuralsirkrishna-7B
6be791ab6dff0e9b0e222743d2973717ef5250c1
76.413546
apache-2.0
0
7
true
false
true
true
2024-03-11T18:44:53Z
false
73.208191
89.036049
64.624922
77.395289
84.846093
69.370735
false