Column schema (type and observed range or number of distinct values):

eval_name: string, length 9 to 97
Precision: string, 5 distinct values
Type: string, 6 distinct values
T: string, 6 distinct values
Weight type: string, 3 distinct values
Architecture: string, 53 distinct values
Model: string, length 355 to 611
fullname: string, length 4 to 89
Model sha: string, length 0 to 40
Average ⬆️: float64, 27 to 81.3
Hub License: string, 35 distinct values
Hub ❤️: int64, 0 to 4.88k
#Params (B): int64, 0 to 238
Available on the hub: bool, 2 classes
Merged: bool, 2 classes
MoE: bool, 2 classes
Flagged: bool, 1 class
date: string, length 0 to 26
Chat Template: bool, 2 classes
ARC: float64, 19.7 to 87.5
HellaSwag: float64, 20.7 to 92.8
MMLU: float64, 17.8 to 89.4
TruthfulQA: float64, 27.9 to 82.3
Winogrande: float64, 47.2 to 91.5
GSM8K: float64, 0 to 88.2
Maintainers Choice: bool, 2 classes
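Each record below lists its values one per line in the column order above; fields that are empty for a given model (for example a missing Hub License, T, or date) may appear as null or be omitted from that record. For programmatic work with a leaderboard dump like this one, the following is a minimal sketch that loads it with the Hugging Face datasets library and filters it with pandas. The dataset id "open-llm-leaderboard/contents" and the "train" split are assumptions for illustration only; substitute whatever repository actually hosts these rows.

```python
# Minimal sketch, assuming the rows live in a Hub dataset such as
# "open-llm-leaderboard/contents" with a "train" split (assumed, not confirmed here).
from datasets import load_dataset
import pandas as pd

pd.set_option("display.max_colwidth", 80)  # fullnames can be long

ds = load_dataset("open-llm-leaderboard/contents", split="train")  # assumed id and split
df = ds.to_pandas()

# Keep models that are available on the Hub, have at most 13B parameters,
# and score above 70 on the leaderboard average (null averages fail the comparison).
subset = df[
    df["Available on the hub"]
    & (df["#Params (B)"] <= 13)
    & (df["Average ⬆️"] > 70)
]

# Rank the remaining models by GSM8K and print a few informative columns.
cols = ["fullname", "Precision", "Average ⬆️", "GSM8K", "Hub License"]
print(subset.sort_values("GSM8K", ascending=False)[cols].head(10).to_string(index=False))
```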
maximuslee07_llama-2-13b-rockwellautomation_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Delta
LlamaForCausalLM
https://huggingface.co/maximuslee07/llama-2-13b-rockwellautomation 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maximuslee07__llama-2-13b-rockwellautomation
maximuslee07/llama-2-13b-rockwellautomation
2bec12a875dd8cb22550c02082ae81e138018ebe
null
llama2
0
13
true
true
true
true
2024-01-03T20:31:09Z
false
28.156997
25.771759
25.142023
null
49.802684
0
false
maximuslee07_llama-2-7b-rockwell-final_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/maximuslee07/llama-2-7b-rockwell-final 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maximuslee07__llama-2-7b-rockwell-final
maximuslee07/llama-2-7b-rockwell-final
de4cfe99e9e3db62733b40f48b2b11faf9abe4bf
50.551634
llama2
0
6
false
true
true
true
2023-10-16T12:46:18Z
false
52.730375
79.097789
47.881701
47.210003
68.429361
7.960576
false
mayacinka_Buttercup-7b-dpo-slerp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/mayacinka/Buttercup-7b-dpo-slerp 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__Buttercup-7b-dpo-slerp
mayacinka/Buttercup-7b-dpo-slerp
a9f4d04b59d764a45fabac9dd3d7f72b795967f0
76.189451
0
7
false
true
true
true
2024-02-17T21:42:12Z
false
72.696246
89.085839
64.500196
77.171411
84.767167
68.915845
false
mayacinka_Buttercup-7b-dpo-ties_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/mayacinka/Buttercup-7b-dpo-ties 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__Buttercup-7b-dpo-ties
mayacinka/Buttercup-7b-dpo-ties
608d7998c1b8f4707e065642a7cfa3d0ddb80100
76.189451
0
7
false
true
true
true
2024-02-18T05:23:16Z
false
72.696246
89.085839
64.500196
77.171411
84.767167
68.915845
false
mayacinka_CalmeRity-stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/CalmeRity-stock 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__CalmeRity-stock
mayacinka/CalmeRity-stock
ee49960ec6f896a9d287b8117aaa5f543a857a60
76.588405
0
7
false
true
true
true
2024-04-17T05:04:48Z
false
73.208191
89.145589
64.446751
77.966808
85.240726
69.522365
false
mayacinka_NeuralZephyr-Beagle-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/NeuralZephyr-Beagle-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__NeuralZephyr-Beagle-7B
mayacinka/NeuralZephyr-Beagle-7B
91fb2de32d29aec936e54c6edeea4ae778259b00
71.567377
apache-2.0
0
7
true
false
true
true
2024-02-16T21:43:38Z
false
68.600683
86.377216
64.666885
65.165768
81.136543
63.457165
false
mayacinka_Open-StaMis-stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/Open-StaMis-stock 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__Open-StaMis-stock
mayacinka/Open-StaMis-stock
d52b17ca083f993f1e18cc11b950d31a3cd67670
52.965203
apache-2.0
0
7
true
false
true
true
2024-04-17T05:05:31Z
false
59.300341
69.637522
51.780751
45.428592
74.585635
17.058378
false
mayacinka_Open-StaMis-v02-stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/Open-StaMis-v02-stock 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__Open-StaMis-v02-stock
mayacinka/Open-StaMis-v02-stock
90755db307bc46ca07553689a15f0dbee2179d70
54.138111
apache-2.0
0
7
true
false
true
true
2024-04-17T05:06:05Z
false
60.580205
74.646485
52.276493
43.21805
75.532755
18.574678
false
mayacinka_West-Ramen-7Bx4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
https://huggingface.co/mayacinka/West-Ramen-7Bx4 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__West-Ramen-7Bx4
mayacinka/West-Ramen-7Bx4
bc62dcbb054c7b6368d85eda9f2d41750e4d69f9
69.334545
apache-2.0
0
24
true
true
true
true
2024-02-29T23:55:32Z
false
67.576792
85.520813
62.693408
61.002302
81.21547
57.998484
false
mayacinka_chatty-djinn-14B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/chatty-djinn-14B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__chatty-djinn-14B
mayacinka/chatty-djinn-14B
3e38656d8a5110f0bf05e7cb8cec2ae8043656c4
72.082541
apache-2.0
0
13
true
false
true
true
2024-02-27T05:19:10Z
false
70.392491
86.446923
64.399358
67.570571
83.109708
60.576194
false
mayacinka_frankencup-dpo_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/mayacinka/frankencup-dpo 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__frankencup-dpo
mayacinka/frankencup-dpo
9248c7340053361655743f40acd4b9c1b5d0815d
48.25672
0
14
false
true
true
true
2024-02-17T22:28:20Z
false
42.662116
60.545708
62.211228
50.719532
73.401736
0
false
mayacinka_yam-jom-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/yam-jom-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__yam-jom-7B
mayacinka/yam-jom-7B
fdd98b8000db4e2a9112184fa384de812069b5cd
76.600566
apache-2.0
0
7
true
false
true
true
2024-03-02T15:56:11Z
false
73.37884
89.145589
64.512044
78.043724
84.92502
69.59818
false
mayacinka_yam-jom-7B-dare_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/mayacinka/yam-jom-7B-dare 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__yam-jom-7B-dare
mayacinka/yam-jom-7B-dare
5d79006083e269006e4cfdf8ebe2e902a258e6f3
76.600154
apache-2.0
0
7
true
false
true
true
2024-03-07T14:40:13Z
false
73.37884
89.13563
64.376771
78.037964
84.846093
69.825625
false
mayacinka_yam-jom-7B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/yam-jom-7B-slerp 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__yam-jom-7B-slerp
mayacinka/yam-jom-7B-slerp
24f3ae950139f9962e34003d567ba2825ec39e64
76.452763
apache-2.0
0
7
true
false
true
true
2024-03-03T06:06:48Z
false
72.696246
89.016132
64.640895
77.773628
84.68824
69.90144
false
mayacinka_yam-jom-7B-ties_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/yam-jom-7B-ties 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__yam-jom-7B-ties
mayacinka/yam-jom-7B-ties
f57717445a39fdaf5cae2eafb2c46576e4481e6d
76.444087
apache-2.0
0
7
true
false
true
true
2024-03-03T05:51:09Z
false
73.208191
89.046007
64.773012
77.508746
84.530387
69.59818
false
mayacinka_yam-sam-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/mayacinka/yam-sam-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mayacinka__yam-sam-7B
mayacinka/yam-sam-7B
c031f5e40b3e220c719e0430f63b6b11794084ae
74.57799
apache-2.0
0
7
true
false
true
true
2024-03-02T19:04:42Z
false
70.904437
87.920733
65.392585
71.303557
83.030781
68.915845
false
maywell_Mini_Synatra_SFT_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/Mini_Synatra_SFT 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Mini_Synatra_SFT
maywell/Mini_Synatra_SFT
fc042f671dc0c94b21a6107eda75a6f9c8d44f2d
63.385479
cc-by-sa-4.0
2
0
true
true
true
true
2023-11-25T01:24:00Z
false
62.457338
83.439554
61.196743
53.672193
74.664562
44.882487
false
maywell_PiVoT-0.1-Evil-a_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/PiVoT-0.1-Evil-a 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-0.1-Evil-a
maywell/PiVoT-0.1-Evil-a
b6e20287ba4156f06b4288d4003acc677040527f
59.164905
cc-by-sa-4.0
38
7
true
true
true
true
2023-11-26T15:34:15Z
false
59.641638
81.477793
58.936292
39.228328
75.295975
40.409401
false
maywell_PiVoT-0.1-early_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/PiVoT-0.1-early 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-0.1-early
maywell/PiVoT-0.1-early
6eeae58a1a292a1d7f989952a07aead6d5da3c69
64.579979
cc-by-sa-4.0
7
7
true
true
true
true
2023-11-24T07:39:19Z
false
62.457338
82.97152
61.018416
62.887562
73.717443
44.427597
false
maywell_PiVoT-10.7B-Mistral-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/PiVoT-10.7B-Mistral-v0.2 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-10.7B-Mistral-v0.2
maywell/PiVoT-10.7B-Mistral-v0.2
a496457d0743b6030ffbb96dad2dc6a62d143943
64.248383
cc-by-sa-4.0
5
10
true
true
true
true
2023-12-16T03:28:51Z
false
63.31058
81.676957
59.859503
58.231093
80.031571
42.380591
false
maywell_PiVoT-MoE_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
https://huggingface.co/maywell/PiVoT-MoE 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-MoE
maywell/PiVoT-MoE
5d1159dd60ec2cc92dbc52508430e620b6adbdaa
63.037
cc-by-nc-4.0
8
36
true
true
false
true
2023-12-22T03:26:14Z
false
63.90785
83.519219
60.713972
54.638393
76.322021
39.120546
false
maywell_PiVoT-SOLAR-10.7B-RP_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/maywell/PiVoT-SOLAR-10.7B-RP 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-SOLAR-10.7B-RP
maywell/PiVoT-SOLAR-10.7B-RP
348a5ccfc4c8c9032ae6234a8fca72110ed4e5ee
66.418361
cc-by-sa-4.0
7
10
true
true
true
true
2023-12-22T03:27:04Z
false
65.102389
81.826329
64.259281
56.540078
76.953433
53.828658
false
maywell_PiVoT-SUS-RP_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/maywell/PiVoT-SUS-RP 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-SUS-RP
maywell/PiVoT-SUS-RP
1b3a5c98381f37a2ec97ce80d1d88d472a7d1802
72.572399
apache-2.0
5
34
true
true
true
true
2024-01-15T10:37:43Z
false
66.552901
84.22625
76.234695
54.566099
83.346488
70.507961
false
maywell_Synatra-10.7B-v0.4_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/maywell/Synatra-10.7B-v0.4 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-10.7B-v0.4
maywell/Synatra-10.7B-v0.4
ae32ccb01cc971cfb36370876bf8981db243b2a3
65.48424
cc-by-sa-4.0
9
10
true
true
true
true
2023-12-28T13:59:59Z
false
64.931741
82.473611
62.501311
51.113987
81.846882
50.037908
false
maywell_Synatra-11B-Testbench_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
https://huggingface.co/maywell/Synatra-11B-Testbench 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-11B-Testbench
maywell/Synatra-11B-Testbench
9399ea6c2a1d955e31d6b4d68b2b86115aea0e59
56.172446
0
11
false
true
true
true
2023-10-15T12:35:41Z
false
57.337884
78.65963
55.55707
51.969845
75.769534
17.740713
false
maywell_Synatra-7B-v0.3-RP_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/Synatra-7B-v0.3-RP 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-RP
maywell/Synatra-7B-v0.3-RP
372f6e0ab2c20b93e0c42218f76a71a4f9bb282e
59.26097
cc-by-nc-4.0
20
7
true
true
true
true
2023-11-16T02:07:02Z
false
62.201365
82.294364
60.799918
52.637913
76.479874
21.152388
false
maywell_Synatra-7B-v0.3-dpo_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/Synatra-7B-v0.3-dpo 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-7B-v0.3-dpo
maywell/Synatra-7B-v0.3-dpo
405a4f1e6513cd1b8de5eb4e003bb49cc86d1f8a
60.546416
cc-by-sa-4.0
23
7
true
true
true
true
2023-11-16T02:06:48Z
false
62.798635
82.583151
61.462928
56.460587
76.243094
23.730099
false
maywell_Synatra-RP-Orca-2-7b-v0.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/maywell/Synatra-RP-Orca-2-7b-v0.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-RP-Orca-2-7b-v0.1
maywell/Synatra-RP-Orca-2-7b-v0.1
da80bc823c407c28c464cc0547a8ed9e0ca82f79
59.649811
apache-2.0
6
6
true
true
true
true
2023-11-25T01:18:33Z
false
57.679181
77.365067
56.096451
52.521282
74.585635
39.651251
false
maywell_Synatra-RP-Orca-2-7b-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/maywell/Synatra-RP-Orca-2-7b-v0.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-RP-Orca-2-7b-v0.1
maywell/Synatra-RP-Orca-2-7b-v0.1
da80bc823c407c28c464cc0547a8ed9e0ca82f79
59.554117
apache-2.0
6
6
true
true
true
true
2024-01-07T12:44:33Z
false
57.423208
77.305318
56.120729
52.548044
74.427782
39.499621
false
maywell_Synatra-V0.1-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
https://huggingface.co/maywell/Synatra-V0.1-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B
maywell/Synatra-V0.1-7B
7ee3416f31a3c7e8d5ab4295ac1b641075f36345
55.857494
0
7
false
true
true
true
2023-10-16T12:46:18Z
false
55.290102
76.628162
55.291469
55.756267
72.770324
19.408643
false
maywell_Synatra-V0.1-7B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/Synatra-V0.1-7B-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__Synatra-V0.1-7B-Instruct
maywell/Synatra-V0.1-7B-Instruct
7ee3416f31a3c7e8d5ab4295ac1b641075f36345
55.857494
cc-by-nc-4.0
16
7
true
true
true
true
2023-11-06T10:31:15Z
false
55.290102
76.628162
55.291469
55.756267
72.770324
19.408643
false
maywell_TinyLlama-MoE-Chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
https://huggingface.co/maywell/TinyLlama-MoE-Chat 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat
maywell/TinyLlama-MoE-Chat
2d786c9077b949d7ee3f5201813d7edccc7bd2da
37.809553
0
6
true
true
false
true
2024-01-05T15:42:07Z
false
34.641638
59.221271
29.901243
39.37026
62.509866
1.21304
false
maywell_TinyLlama-MoE-Chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
https://huggingface.co/maywell/TinyLlama-MoE-Chat 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat
maywell/TinyLlama-MoE-Chat
2d786c9077b949d7ee3f5201813d7edccc7bd2da
37.710916
0
6
true
true
false
true
2024-01-05T15:42:10Z
false
34.726962
59.290978
29.714167
39.353634
62.194159
0.985595
false
maywell_TinyLlama-MoE-Chat-0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
https://huggingface.co/maywell/TinyLlama-MoE-Chat-0.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat-0.1
maywell/TinyLlama-MoE-Chat-0.1
2ebc34217cafbff7812e85fd59c682550bbeb4f8
36.704401
0
6
true
true
false
true
2024-01-07T23:18:36Z
false
34.385666
56.721769
29.360747
37.815267
59.668508
2.27445
false
maywell_TinyWand-DPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/maywell/TinyWand-DPO 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyWand-DPO
maywell/TinyWand-DPO
7bf42524d664785d92243576b1f7d3b3ed463819
35.126643
apache-2.0
2
1
true
true
true
true
2024-01-04T06:53:10Z
false
31.65529
50.418243
26.217084
45.798805
54.775059
1.895375
false
maywell_TinyWand-SFT_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/maywell/TinyWand-SFT 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyWand-SFT
maywell/TinyWand-SFT
ac1dffae8e8a8324fdac7a266a8ce82e6d033577
34.605215
apache-2.0
5
1
true
true
true
true
2024-01-04T06:53:15Z
false
31.399317
49.960167
25.978828
43.076281
55.169692
2.047005
false
maywell_kiqu-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/maywell/kiqu-70b 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__kiqu-70b
maywell/kiqu-70b
c8ad8ee000e4e042d80e4cf53fb6d0815d7743dd
75.29353
cc-by-sa-4.0
26
68
true
true
true
true
2024-02-18T14:04:36Z
false
72.098976
87.940649
74.930509
63.483995
84.846093
68.460955
false
maywell_koOpenChat-sft_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/maywell/koOpenChat-sft 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__koOpenChat-sft
maywell/koOpenChat-sft
47472b36e181694422564b130ee075ffa596537d
58.614215
cc-by-sa-4.0
5
0
true
true
true
true
2023-11-16T02:06:39Z
false
59.812287
78.729337
61.317238
51.240492
76.400947
24.184989
false
medalpaca_medalpaca-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/medalpaca/medalpaca-7b 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_medalpaca__medalpaca-7b
medalpaca/medalpaca-7b
b57b9f5ff34059e485b769973d023021fc66a8f7
48.445508
cc
65
7
true
true
true
true
2023-10-16T12:48:18Z
false
54.095563
80.422227
41.468622
40.462244
71.191792
3.0326
false
mediocredev_open-llama-3b-v2-chat_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/mediocredev/open-llama-3b-v2-chat 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-chat
mediocredev/open-llama-3b-v2-chat
0d171b62a41b2d249cd2ff235b66638e3a894c98
40.928981
apache-2.0
2
3
true
true
true
true
2023-12-22T08:03:10Z
false
40.614334
70.30472
28.732634
37.835409
65.509077
2.57771
false
mediocredev_open-llama-3b-v2-instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/mediocredev/open-llama-3b-v2-instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mediocredev__open-llama-3b-v2-instruct
mediocredev/open-llama-3b-v2-instruct
4d50e134af1d9806cbdf6bc90795b44ae689deca
42.019935
apache-2.0
4
3
true
true
true
true
2023-12-16T00:46:43Z
false
38.481229
70.244971
39.691213
37.956341
65.745856
0
false
meraGPT_mera-mix-4x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
https://huggingface.co/meraGPT/mera-mix-4x7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meraGPT__mera-mix-4x7B
meraGPT/mera-mix-4x7B
3fbb682586a5c9588aafedb2fda5cd99df9f3192
75.911422
apache-2.0
16
24
true
true
false
true
2024-04-15T12:16:47Z
false
72.952218
89.165505
64.436455
77.168307
85.635359
66.11069
false
mergedlm_zephyrnotus-11b-alpha_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/mergedlm/zephyrnotus-11b-alpha 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mergedlm__zephyrnotus-11b-alpha
mergedlm/zephyrnotus-11b-alpha
a6f74e800b6c77261a1d212bb3e6b2752cbedef9
59.261275
0
10
false
true
true
true
2023-12-02T23:17:39Z
false
61.348123
82.802231
60.66535
57.216809
76.400947
17.134193
false
mervinpraison_tamil-large-language-model-7b-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
https://huggingface.co/mervinpraison/tamil-large-language-model-7b-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_mervinpraison__tamil-large-language-model-7b-v1.0
mervinpraison/tamil-large-language-model-7b-v1.0
b07baafc06099b5835118213e79768a60f4a8973
62.922262
apache-2.0
5
8
true
true
true
true
2024-03-11T06:13:46Z
false
60.153584
82.214698
63.896995
45.089141
77.505919
48.673237
false
meta-llama_Llama-2-13b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Llama-2-13b-chat-hf 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-13b-chat-hf
meta-llama/Llama-2-13b-chat-hf
f848cf15ab9a51ae5735ab28120a9a0773eeb541
54.91333
llama2
987
13
true
true
true
true
2023-09-09T10:52:17Z
false
59.044369
81.935869
54.636271
44.117946
74.506709
15.238817
true
meta-llama_Llama-2-13b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Llama-2-13b-hf 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-13b-hf
meta-llama/Llama-2-13b-hf
7da18fb10421c3ae2a1eb92815bad75e84816e35
55.685848
llama2
554
13
true
true
true
true
2023-09-09T10:52:17Z
false
59.385666
82.125075
55.771036
37.375264
76.637727
22.820318
true
meta-llama_Llama-2-70b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Llama-2-70b-chat-hf 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-70b-chat-hf
meta-llama/Llama-2-70b-chat-hf
7f54101c0fbb67a8143ca23eb8bd09b71f269c74
62.395586
llama2
2,117
68
true
true
true
true
2023-09-09T10:52:17Z
false
64.590444
85.879307
63.90702
52.804732
80.50513
26.686884
true
meta-llama_Llama-2-70b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Llama-2-70b-hf 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-70b-hf
meta-llama/Llama-2-70b-hf
ed7b07231238f836b99bf45701b9a0063576b194
67.867804
llama2
811
68
true
true
true
true
2023-10-16T12:48:18Z
false
67.320819
87.333201
69.832089
44.923494
83.741121
54.056103
true
meta-llama_Llama-2-7b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Llama-2-7b-chat-hf 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-7b-chat-hf
meta-llama/Llama-2-7b-chat-hf
b7701a9e825e79a5ab18b5801be113c2160cc627
50.739774
llama2
3,633
6
true
true
true
true
2023-10-16T12:48:18Z
false
52.901024
78.55009
48.318826
45.57037
71.744278
7.354056
true
meta-llama_Llama-2-7b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Llama-2-7b-hf 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-7b-hf
meta-llama/Llama-2-7b-hf
e8f058fa738b6b308540024e9aa12e274e291f75
50.96642
llama2
1,523
6
true
true
true
true
2023-09-09T10:52:17Z
false
53.071672
78.589922
46.866075
38.757032
74.033149
14.480667
true
meta-llama_Llama-2-7b-hf_4bit
4bit
?
Original
Unknown
https://huggingface.co/meta-llama/Llama-2-7b-hf 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Llama-2-7b-hf
meta-llama/Llama-2-7b-hf
6fdf2e60f86ff2481f2241aaee459f85b5b0bbb9
48.926658
llama2
1,523
0
true
true
true
true
false
53.071672
77.743477
43.796096
38.980203
74.585635
5.382866
true
meta-llama_Meta-Llama-3-70B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Meta-Llama-3-70B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Meta-Llama-3-70B
meta-llama/Meta-Llama-3-70B
a897071033c208db3c2fdba8a1f4b1b3e2fbb283
73.957198
llama3
738
70
true
true
true
true
2024-04-18T17:05:50Z
false
68.771331
87.980482
79.232934
45.562368
85.319653
76.876422
true
meta-llama_Meta-Llama-3-70B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Meta-Llama-3-70B-Instruct
meta-llama/Meta-Llama-3-70B-Instruct
5fcb2901844dde3111159f24205b71c25900ffbd
77.882051
llama3
1,203
70
true
true
true
true
2024-04-18T17:05:16Z
false
71.416382
85.690102
80.059223
61.81015
82.872928
85.443518
true
meta-llama_Meta-Llama-3-8B_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Meta-Llama-3-8B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Meta-Llama-3-8B
meta-llama/Meta-Llama-3-8B
b6887ce03ea47d068bf8502ba6ed27f8c5c12a6b
62.623813
llama3
4,883
8
true
true
true
true
2024-05-28T10:45:04Z
false
60.238908
82.234615
66.704804
42.925766
78.453039
45.185747
true
meta-llama_Meta-Llama-3-8B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Meta-Llama-3-8B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Meta-Llama-3-8B
meta-llama/Meta-Llama-3-8B
b6887ce03ea47d068bf8502ba6ed27f8c5c12a6b
62.354406
llama3
4,883
8
true
true
true
true
2024-04-18T16:47:17Z
false
59.215017
82.015535
66.494955
43.952265
77.111287
45.337377
true
meta-llama_Meta-Llama-3-8B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-llama__Meta-Llama-3-8B-Instruct
meta-llama/Meta-Llama-3-8B-Instruct
d3aa29f914761e8ea0298051fbaf8dd173e94db5
66.869662
llama3
2,756
8
true
true
true
true
2024-04-18T17:04:58Z
false
60.750853
78.55009
67.0722
51.649723
74.506709
68.6884
true
meta-math_MetaMath-13B-V1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/meta-math/MetaMath-13B-V1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-13B-V1.0
meta-math/MetaMath-13B-V1.0
0b448f6f64808f8bca94dc871e96a3eae7e95621
52.707811
llama2
11
13
true
true
true
true
2023-10-16T12:48:18Z
false
49.488055
76.478789
47.743624
41.57534
72.454617
28.506444
true
meta-math_MetaMath-70B-V1.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/meta-math/MetaMath-70B-V1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-70B-V1.0
meta-math/MetaMath-70B-V1.0
783a3c7d5d0a75e6e11074f2577b90dd219ef7b1
67.018692
llama2
15
70
true
true
true
true
2023-10-16T12:54:17Z
false
68.003413
86.84525
69.308312
50.97969
82.320442
44.655042
true
meta-math_MetaMath-Llemma-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/meta-math/MetaMath-Llemma-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-Llemma-7B
meta-math/MetaMath-Llemma-7B
e31ec61dccd8fa24f44f0592a518491ef76a2235
53.193852
apache-2.0
12
7
true
true
true
true
2023-12-10T08:40:57Z
false
46.501706
61.690898
47.658575
39.610018
62.746646
60.955269
true
meta-math_MetaMath-Mistral-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/meta-math/MetaMath-Mistral-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_meta-math__MetaMath-Mistral-7B
meta-math/MetaMath-Mistral-7B
016a7bb03bfcd953860357e1a16d5b333b887d26
65.783136
apache-2.0
89
7
true
true
true
true
2023-12-03T15:31:15Z
false
60.665529
82.583151
61.95005
44.890521
75.769534
68.84003
true
microsoft_CodeGPT-small-py_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/microsoft/CodeGPT-small-py 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__CodeGPT-small-py
microsoft/CodeGPT-small-py
e5f31df92bfb7b7a808ea8d1c7557488e1bdff7f
29.167325
23
0
true
true
true
true
2023-10-16T12:46:18Z
false
22.696246
27.255527
25.04909
51.226452
48.776638
0
true
microsoft_DialoGPT-large_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/microsoft/DialoGPT-large 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-large
microsoft/DialoGPT-large
04e3e47b52dadbcf7688aa61a7ed0438ecf9184c
29.274005
mit
255
0
true
true
true
true
2023-10-16T12:54:17Z
false
23.37884
25.771759
23.813663
50.272508
52.407261
0
true
microsoft_DialoGPT-medium_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/microsoft/DialoGPT-medium 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-medium
microsoft/DialoGPT-medium
9d5c5fadcc072b693fb5a5e29416bbf3f503c26c
28.861184
mit
302
0
true
true
true
true
2023-09-09T10:52:17Z
false
24.488055
26.209918
25.838967
47.064262
49.565904
0
true
microsoft_DialoGPT-small_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/microsoft/DialoGPT-small 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__DialoGPT-small
microsoft/DialoGPT-small
97d0fec744c2cb4d48f5db51d17e3258e185858e
29.188953
mit
86
0
true
true
true
true
2023-09-09T10:52:17Z
false
25.767918
25.791675
25.812743
47.485141
50.276243
0
true
microsoft_Orca-2-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/microsoft/Orca-2-13b 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-13b
microsoft/Orca-2-13b
2539ff53e6baa4cc603774ad5a2d646f4041ea4e
58.644618
other
657
13
true
true
false
true
2023-11-21T20:09:03Z
false
60.665529
79.814778
60.370699
56.410817
76.637727
17.968158
true
microsoft_Orca-2-13b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/microsoft/Orca-2-13b 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-13b
microsoft/Orca-2-13b
2539ff53e6baa4cc603774ad5a2d646f4041ea4e
61.981357
other
657
13
true
true
false
true
2023-12-29T22:36:21Z
false
60.921502
79.854611
60.301159
56.420382
76.5588
37.831691
true
microsoft_Orca-2-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/microsoft/Orca-2-7b 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-7b
microsoft/Orca-2-7b
60e31e6bdcf582ad103b807cb74b73ee1d2c4b17
54.548688
other
207
7
true
true
false
true
2023-11-21T18:34:51Z
false
54.095563
76.190002
56.371153
52.446632
73.480663
14.708112
true
microsoft_Phi-3-medium-128k-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
https://huggingface.co/microsoft/Phi-3-medium-128k-instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Phi-3-medium-128k-instruct
microsoft/Phi-3-medium-128k-instruct
1e10cf49da9eceb263824a4e4646d0ecba4f7dec
72.998268
mit
299
13
true
true
false
true
2024-05-28T18:55:02Z
false
66.467577
84.853615
76.678585
54.515803
74.427782
81.046247
true
microsoft_Phi-3-medium-128k-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
https://huggingface.co/microsoft/Phi-3-medium-128k-instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Phi-3-medium-128k-instruct
microsoft/Phi-3-medium-128k-instruct
cae1d42b5577398fd1be9f0746052562ae552886
72.997064
mit
299
13
true
true
false
true
2024-05-31T06:14:27Z
false
66.467577
84.913364
76.748218
54.594196
74.743489
80.515542
true
microsoft_Phi-3-medium-4k-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
https://huggingface.co/microsoft/Phi-3-medium-4k-instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Phi-3-medium-4k-instruct
microsoft/Phi-3-medium-4k-instruct
d27c49ed6abea9167240288dceb4ab6bca855293
73.448553
mit
136
13
true
true
false
true
2024-05-27T15:40:50Z
false
67.320819
85.759809
77.830911
57.710066
72.691397
79.378317
true
microsoft_Phi-3-mini-128k-instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
https://huggingface.co/microsoft/Phi-3-mini-128k-instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Phi-3-mini-128k-instruct
microsoft/Phi-3-mini-128k-instruct
ebee18c488086b396dde649f2aa6548b9b8d2404
68.070922
mit
1,355
3
true
true
false
true
2024-04-25T13:14:03Z
false
63.139932
80.093607
68.703993
54.116386
72.84925
69.522365
true
microsoft_Phi-3-mini-4k-instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
https://huggingface.co/microsoft/Phi-3-mini-4k-instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Phi-3-mini-4k-instruct
microsoft/Phi-3-mini-4k-instruct
b86bcaf57ea4dfdec5dbe12a377028b2fab0d480
69.905171
mit
724
3
true
true
false
true
2024-04-26T08:51:19Z
false
62.969283
80.601474
69.083014
59.875408
72.375691
74.526156
true
microsoft_WizardLM-2-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/microsoft/WizardLM-2-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__WizardLM-2-7B
microsoft/WizardLM-2-7B
7a41d35cebdd0388d8dfb26db2ad253f106354b3
63.660212
0
7
true
true
true
true
2024-04-15T16:31:26Z
false
62.883959
83.260307
61.533749
56.978409
73.55959
43.745262
true
microsoft_phi-1_5_bfloat16
bfloat16
🟢 pretrained
🟢
Original
PhiForCausalLM
https://huggingface.co/microsoft/phi-1_5 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__phi-1_5
microsoft/phi-1_5
ea95720a352172db6fcbcd89032bfb1cb8481797
47.686834
mit
1,291
1
true
true
true
true
2023-11-06T10:31:15Z
false
52.901024
63.792073
43.886468
40.889939
72.217837
12.433662
true
microsoft_phi-2_float16
float16
?
Original
PhiForCausalLM
https://huggingface.co/microsoft/phi-2 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__phi-2
microsoft/phi-2
d3186761bf5c4409f7679359284066c25ab668ee
61.325084
mit
3,191
2
true
true
true
true
2024-06-09T15:02:44Z
false
61.09215
75.114519
58.112328
44.468396
74.348856
54.814253
true
microsoft_phi-2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
https://huggingface.co/microsoft/phi-2 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__phi-2
microsoft/phi-2
b10c3eba545ad279e7208ee3a5d644566f001670
61.087645
mit
3,191
2
true
true
true
true
2024-04-12T12:27:45Z
false
61.006826
74.915356
57.920267
44.236878
73.480663
54.965883
true
microsoft_rho-math-1b-v0.1_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/microsoft/rho-math-1b-v0.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__rho-math-1b-v0.1
microsoft/rho-math-1b-v0.1
a183246847fc6ad1014a0275006d7fc672b90187
34.985405
mit
11
1
true
true
true
true
2024-04-12T16:20:42Z
false
34.300341
53.335989
27.052346
35.47632
59.747435
0
true
migtissera_Llama-3-8B-Synthia-v3.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Llama-3-8B-Synthia-v3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Llama-3-8B-Synthia-v3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Llama-3-8B-Synthia-v3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Llama-3-8B-Synthia-v3.5
cb00aa93e9c2d45951b76383e736b5e61e8bc86f
67.15041
llama3
14
8
true
true
true
true
2024-05-17T03:17:10Z
false
61.518771
80.940052
65.02308
54.59666
77.821626
63.002274
false
migtissera_SynthIA-70B-v1.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/SynthIA-70B-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/SynthIA-70B-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-70B-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/SynthIA-70B-v1.5
40773af947d39495841d825337fdbc7ca977ef1f
70.23053
llama2
42
70
true
true
true
true
2024-01-07T21:29:28Z
false
69.368601
86.974706
69.162165
57.401258
83.662194
54.814253
false
migtissera_SynthIA-7B-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/SynthIA-7B-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/SynthIA-7B-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/SynthIA-7B-v1.3
8e6d0b18be876e0ebfff47d6c4f33d776f189971
59.336474
apache-2.0
142
7
true
true
true
true
2023-10-16T13:00:29Z
false
62.116041
83.449512
62.647456
51.369079
78.847672
17.589083
false
migtissera_SynthIA-7B-v1.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/SynthIA-7B-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/SynthIA-7B-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/SynthIA-7B-v1.5
5a9912ef90a0efc1aaea327e5cf3e9554c8bd897
59.593896
apache-2.0
4
7
true
true
true
true
2023-10-14T05:37:38Z
false
62.713311
83.369847
63.480494
51.31997
79.242305
17.437453
false
migtissera_Synthia-13B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-13B
fbb23bc41438b016f1df1e9180c6c350a03557ea
55.406432
llama2
11
13
true
true
true
true
2023-09-09T10:52:17Z
false
59.982935
81.856204
56.108606
47.412428
76.085241
10.993177
false
migtissera_Synthia-13B-v1.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-13B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-13B-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-13B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-13B-v1.2
60d4937ac3c4dcb84c40bbf7265c5cc7f5f3d4f9
55.902362
llama2
9
13
true
true
true
true
2023-11-06T10:31:15Z
false
61.262799
82.931687
56.47472
47.271914
76.479874
10.993177
false
migtissera_Synthia-70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-70B
d63dfdd0baed756981f5f78f7419fd822c572362
66.719226
llama2
11
70
true
true
true
true
2023-10-16T12:46:18Z
false
69.453925
87.114121
68.909224
59.788478
83.662194
31.387415
false
migtissera_Synthia-70B-v1.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-70B-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-70B-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-70B-v1.1
05a13f6adfe95a713dff04dc2eaa214c77c2512a
66.809065
llama2
7
70
true
true
true
true
2023-10-16T12:48:18Z
false
70.051195
87.124079
70.337704
57.836916
83.662194
31.842305
false
migtissera_Synthia-70B-v1.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-70B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-70B-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-70B-v1.2
9b92ee1093b125035ba1649dca6f4ceb9d86a656
66.902967
llama2
15
70
true
true
true
true
2023-09-09T10:52:17Z
false
70.477816
86.984664
70.134482
58.635157
83.267561
31.91812
false
migtissera_Synthia-70B-v1.2b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-70B-v1.2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-70B-v1.2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-70B-v1.2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-70B-v1.2b
7b687d6e4101b8bb8cc4062f8a318d639098a55d
67.000081
llama2
28
70
true
true
true
true
2023-10-16T12:48:18Z
false
68.771331
87.572197
68.813176
57.690827
83.898974
35.25398
false
migtissera_Synthia-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-7B
4f9e95665d95b4c692910190ff77257216e476f1
51.831928
llama2
4
7
true
true
true
true
2023-09-09T10:52:17Z
false
56.143345
78.599881
50.349818
45.032689
74.269929
6.595906
false
migtissera_Synthia-7B-v1.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-7B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-7B-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-7B-v1.2
85ea4f4818478084eedd01e958ac5cc7cf64b3bb
52.714755
llama2
10
7
true
true
true
true
2023-10-16T12:48:18Z
false
54.351536
79.286995
49.332564
48.916296
73.55959
10.841547
false
migtissera_Synthia-7B-v3.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-7B-v3.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-7B-v3.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-7B-v3.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-7B-v3.0
93c2e8b8055b42779f2b68059ebe38af6f2789c4
61.987537
apache-2.0
21
7
true
true
true
true
2023-12-18T05:08:21Z
false
62.457338
83.78809
63.903879
43.845039
77.900552
40.030326
false
migtissera_Synthia-v3.0-11B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Synthia-v3.0-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Synthia-v3.0-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-v3.0-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Synthia-v3.0-11B
236b393ae07c1d80004eeda47ee017a71a899853
67.352548
apache-2.0
13
11
true
true
true
true
2023-12-27T20:24:04Z
false
64.078498
85.321649
66.175508
48.221845
84.21468
56.103108
false
migtissera_Tess-10.7B-v1.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-10.7B-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-10.7B-v1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-10.7B-v1.5
634a8454c84e415721e7cab1373e0fe8daf0e944
66.551671
apache-2.0
0
10
true
true
true
true
2024-01-27T06:50:12Z
false
65.017065
84.066919
65.090112
47.430081
83.346488
54.359363
false
migtissera_Tess-10.7B-v1.5b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-10.7B-v1.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-10.7B-v1.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-10.7B-v1.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-10.7B-v1.5b
c6659f344448dc66044df9b5b3e223419b0bcfbd
67.213661
apache-2.0
13
10
true
true
true
true
2024-01-28T18:00:45Z
false
65.358362
85.331607
66.238657
47.380416
82.794002
56.178923
false
migtissera_Tess-2.0-Llama-3-70B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-2.0-Llama-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-2.0-Llama-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-2.0-Llama-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-2.0-Llama-3-70B
448cffd9d1334211f50fdb513272da7b77d41a8a
75.596076
llama3
11
70
true
true
false
true
2024-05-04T17:50:14Z
false
70.307167
88.458474
79.752084
54.742471
85.714286
74.601971
false
migtissera_Tess-2.0-Llama-3-70B-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-2.0-Llama-3-70B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-2.0-Llama-3-70B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-2.0-Llama-3-70B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-2.0-Llama-3-70B-v0.2
dc64fa63c3ee844f18f14f0179f88bd1e95a805a
74.338741
llama3
8
70
true
true
false
true
2024-05-06T02:57:13Z
false
69.539249
88.259311
79.512643
50.000213
85.635359
73.085671
false
migtissera_Tess-2.0-Llama-3-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-2.0-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-2.0-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-2.0-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-2.0-Llama-3-8B
65583e13f80f8b026376d1898ceb2f66582a9050
64.806269
llama3
16
8
true
true
false
true
2024-05-05T01:07:10Z
false
61.68942
82.732523
65.971703
51.602283
78.926598
47.915087
false
migtissera_Tess-2.0-Mixtral_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-2.0-Mixtral" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-2.0-Mixtral</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-2.0-Mixtral" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-2.0-Mixtral
093edcad129edf9c6fa56518209fede54c90a32b
68.572837
0
46
false
true
true
true
2024-04-01T05:51:34Z
false
68.259386
86.855208
71.34413
48.009655
80.031571
56.937074
false
migtissera_Tess-2.0-Mixtral-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-2.0-Mixtral-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-2.0-Mixtral-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-2.0-Mixtral-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-2.0-Mixtral-v0.2
b6904731bbf0474c67cd5a8f1056a6f6d29217ea
68.923871
0
46
false
true
true
true
2024-04-09T01:48:29Z
false
68.003413
86.666003
70.925861
48.922963
80.26835
58.756634
false
migtissera_Tess-2.0-Yi-34B-200K_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-2.0-Yi-34B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-2.0-Yi-34B-200K</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-2.0-Yi-34B-200K" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-2.0-Yi-34B-200K
9dc20c3e0d2cae5e390980a7a2a34f289bec73b3
69.058777
other
5
34
true
true
true
true
2024-04-01T05:54:23Z
false
63.481229
81.975702
75.213895
50.847435
80.74191
62.092494
false
migtissera_Tess-34B-v1.4_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-34B-v1.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-34B-v1.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Tess-34B-v1.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-34B-v1.4
173d834656c3965cbaa49be6aab0772c3ce57821
70.10878
other
15
34
true
true
true
true
2023-12-05T19:04:55Z
false
64.590444
83.369847
75.019064
56.791444
81.21547
59.666414
false