Column schema (name: dtype, observed length/range or number of distinct values):

eval_name: string, length 12 to 111
Precision: string, 3 values
Type: string, 7 values
T: string, 7 values
Weight type: string, 2 values
Architecture: string, 64 values
Model: string, length 355 to 689
fullname: string, length 4 to 102
Model sha: string, length 0 to 40
Average ⬆️: float64, range 0.74 to 52.1
Hub License: string, 27 values
Hub ❤️: int64, range 0 to 6.09k
#Params (B): float64, range -1 to 141
Available on the hub: bool, 2 classes
MoE: bool, 2 classes
Flagged: bool, 2 classes
Chat Template: bool, 2 classes
CO₂ cost (kg): float64, range 0.04 to 187
IFEval Raw: float64, range 0 to 0.9
IFEval: float64, range 0 to 90
BBH Raw: float64, range 0.22 to 0.83
BBH: float64, range 0.25 to 76.7
MATH Lvl 5 Raw: float64, range 0 to 0.71
MATH Lvl 5: float64, range 0 to 71.5
GPQA Raw: float64, range 0.21 to 0.47
GPQA: float64, range 0 to 29.4
MUSR Raw: float64, range 0.29 to 0.6
MUSR: float64, range 0 to 38.7
MMLU-PRO Raw: float64, range 0.1 to 0.73
MMLU-PRO: float64, range 0 to 70
Merged: bool, 2 classes
Official Providers: bool, 2 classes
Upload To Hub Date: string, 525 values
Submission Date: string, 263 values
Generation: int64, range 0 to 10
Base Model: string, length 4 to 102
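The rows below follow this column order, one field per line (fields with empty string values are omitted). As a minimal sketch of how a table with this schema could be loaded and queried, the snippet below uses the Hugging Face `datasets` library; the dataset id and split name are assumptions inferred from the detail links in the rows, not confirmed by this dump.

```python
# Minimal sketch, assuming the leaderboard contents are published as a Hugging Face
# dataset. The id "open-llm-leaderboard/contents" and the "train" split are
# assumptions; substitute the actual dataset id if it differs.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Each record carries the columns listed above: identifiers (eval_name, fullname,
# Model sha), metadata (Precision, Architecture, Hub License, #Params (B), ...),
# and the raw plus normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA,
# MUSR, MMLU-PRO) together with the aggregate "Average ⬆️".
row = ds[0]
print(row["eval_name"], row["Average ⬆️"], row["#Params (B)"])

# Example query: keep only official-provider models with an average above 40.
strong_official = ds.filter(
    lambda r: r["Official Providers"] and r["Average ⬆️"] > 40
)
print(len(strong_official))
```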
meditsolutions_Llama-3.2-SUN-2.5B-chat_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-2.5B-chat" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-2.5B-chat</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-2.5B-chat-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-2.5B-chat
2bd68a18c0f7984f430acbc2efad76344177aba0
13.98771
llama3.2
2
2.472
true
false
false
true
2.913906
0.560414
56.041415
0.357473
9.409093
0.070997
7.099698
0.259228
1.230425
0.315521
1.106771
0.18135
9.038859
false
false
2024-09-27
2024-10-26
1
meditsolutions/Llama-3.2-SUN-2.5B-chat (Merge)
meditsolutions_Llama-3.2-SUN-HDIC-1B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/Llama-3.2-SUN-HDIC-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/Llama-3.2-SUN-HDIC-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__Llama-3.2-SUN-HDIC-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/Llama-3.2-SUN-HDIC-1B-Instruct
44d22fc1c0a85f880e75397b7fd3d0c6c1408f57
15.901864
llama3.2
1
1.498
true
false
false
true
0.70799
0.682663
68.266311
0.350773
9.529082
0.061934
6.193353
0.236577
0
0.359365
3.78724
0.168717
7.635195
false
false
2024-12-11
2024-12-11
1
meditsolutions/Llama-3.2-SUN-HDIC-1B-Instruct (Merge)
meditsolutions_MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune
a0ffd0cd00cab2245c1f0edcef4d1d8ead4c6d6e
14.528175
apache-2.0
0
7.646
true
false
false
true
2.1182
0.3655
36.550021
0.403485
16.138426
0.026435
2.643505
0.302852
7.04698
0.425344
11.567969
0.218999
13.222148
false
false
2024-11-12
2024-11-13
1
meditsolutions/MSH-Lite-7B-v1-Bielik-v2.3-Instruct-Llama-Prune (Merge)
meditsolutions_MSH-v1-Bielik-v2.3-Instruct-MedIT-merge_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__MSH-v1-Bielik-v2.3-Instruct-MedIT-merge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge
2db5e8871fb3be7e658e3bc6e2885d26b891b8b8
28.550706
apache-2.0
1
11.169
true
false
false
true
1.685482
0.581422
58.142174
0.567172
38.023435
0.207704
20.770393
0.345638
12.751678
0.438458
13.840625
0.349983
27.775931
false
false
2024-10-29
2024-11-06
1
meditsolutions/MSH-v1-Bielik-v2.3-Instruct-MedIT-merge (Merge)
meditsolutions_MedIT-Mesh-3B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/MedIT-Mesh-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/MedIT-Mesh-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__MedIT-Mesh-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/MedIT-Mesh-3B-Instruct
469d1a58f7747c3d456b3308b5a7042df4ab49e3
28.318228
mit
1
3.821
true
false
false
true
1.060973
0.581422
58.142174
0.557552
37.547054
0.203172
20.317221
0.323826
9.8434
0.40476
10.595052
0.40118
33.464465
false
false
2024-11-01
2024-11-01
1
meditsolutions/MedIT-Mesh-3B-Instruct (Merge)
meditsolutions_SmolLM2-MedIT-Upscale-2B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meditsolutions/SmolLM2-MedIT-Upscale-2B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meditsolutions/SmolLM2-MedIT-Upscale-2B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meditsolutions__SmolLM2-MedIT-Upscale-2B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meditsolutions/SmolLM2-MedIT-Upscale-2B
5696c9ea7cbdee0f8ad1845f5a2dc7309f376143
15.922534
apache-2.0
4
2.114
true
false
false
true
0.672283
0.642921
64.292078
0.355112
10.514326
0.055891
5.589124
0.264262
1.901566
0.331365
2.453906
0.197058
10.784205
false
false
2024-12-02
2024-12-02
1
meditsolutions/SmolLM2-MedIT-Upscale-2B (Merge)
meetkai_functionary-small-v3.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meetkai/functionary-small-v3.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meetkai/functionary-small-v3.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meetkai__functionary-small-v3.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meetkai/functionary-small-v3.1
8e43bc1d2e259b91799e704c410a95b8ca458121
24.083336
mit
18
8.03
true
false
false
true
1.408739
0.627458
62.745848
0.498178
28.616315
0.1571
15.70997
0.288591
5.145414
0.383365
6.18724
0.334857
26.095228
false
false
2024-07-26
2024-11-10
0
meetkai/functionary-small-v3.1
meraGPT_mera-mix-4x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/meraGPT/mera-mix-4x7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meraGPT/mera-mix-4x7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meraGPT__mera-mix-4x7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meraGPT/mera-mix-4x7B
09d965c5ef9b66ce419986027e03a915cb869e43
17.854959
apache-2.0
18
24.154
true
false
false
true
3.328801
0.483178
48.317797
0.401899
17.486439
0.053625
5.362538
0.30453
7.270694
0.405656
9.273698
0.274767
19.418587
false
false
2024-04-13
2024-06-27
0
meraGPT/mera-mix-4x7B
mergekit-community_JAJUKA-WEWILLNEVERFORGETYOU-3B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/JAJUKA-WEWILLNEVERFORGETYOU-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/JAJUKA-WEWILLNEVERFORGETYOU-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__JAJUKA-WEWILLNEVERFORGETYOU-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/JAJUKA-WEWILLNEVERFORGETYOU-3B
b4f2b833913f2f7b3ef009b67b47463f10c87e7d
19.545574
0
3.213
false
false
false
false
0.593982
0.494069
49.406907
0.436972
20.143746
0.124622
12.462236
0.292785
5.704698
0.365625
6.969792
0.303275
22.586067
false
false
2025-02-26
2025-02-26
1
mergekit-community/JAJUKA-WEWILLNEVERFORGETYOU-3B (Merge)
mergekit-community_SuperQwen-2.5-1.5B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/SuperQwen-2.5-1.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/SuperQwen-2.5-1.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__SuperQwen-2.5-1.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/SuperQwen-2.5-1.5B
40caabd01f4d263eaff180005128019697ce7ad4
3.259712
2
1.777
false
false
false
true
1.184873
0.133641
13.364096
0.29069
1.735105
0.019637
1.963746
0.254195
0.559284
0.335521
1.106771
0.107463
0.82927
false
false
2025-01-25
2025-01-25
1
mergekit-community/SuperQwen-2.5-1.5B (Merge)
mergekit-community_VirtuosoSmall-InstructModelStock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/VirtuosoSmall-InstructModelStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/VirtuosoSmall-InstructModelStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__VirtuosoSmall-InstructModelStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/VirtuosoSmall-InstructModelStock
4ac90913a36d0f1b7bcf6ed31561137d1f7b0aa6
38.226937
2
14.766
false
false
false
false
3.973855
0.523795
52.379464
0.65179
49.941772
0.409366
40.936556
0.38255
17.673378
0.475573
19.313281
0.542055
49.117169
false
false
2024-12-19
2024-12-19
1
mergekit-community/VirtuosoSmall-InstructModelStock (Merge)
mergekit-community_diabolic6045_ELN-AOC-CAIN_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/diabolic6045_ELN-AOC-CAIN" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/diabolic6045_ELN-AOC-CAIN</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__diabolic6045_ELN-AOC-CAIN-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/diabolic6045_ELN-AOC-CAIN
8c58c8a7139d8002e2acabc66199aad25bb95453
4.069643
0
1.236
false
false
false
false
0.723004
0.086155
8.615474
0.312568
4.630015
0.012085
1.208459
0.263423
1.789709
0.36575
6.052083
0.119099
2.122119
false
false
2025-01-05
2025-01-05
1
mergekit-community/diabolic6045_ELN-AOC-CAIN (Merge)
mergekit-community_mergekit-dare_ties-ajgjgea_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/mergekit-dare_ties-ajgjgea" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/mergekit-dare_ties-ajgjgea</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__mergekit-dare_ties-ajgjgea-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/mergekit-dare_ties-ajgjgea
13203710422db795468a10e1d9fe623d0759a9da
13.518727
0
1.498
false
false
false
true
0.691293
0.526342
52.634233
0.34947
9.245559
0.064199
6.41994
0.264262
1.901566
0.328917
2.647917
0.174368
8.26315
false
false
2025-01-25
2025-01-25
1
mergekit-community/mergekit-dare_ties-ajgjgea (Merge)
mergekit-community_mergekit-della-zgowfmf_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/mergekit-della-zgowfmf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/mergekit-della-zgowfmf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__mergekit-della-zgowfmf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/mergekit-della-zgowfmf
8d99e6b381db8b64944b3dcfb05daa444206782d
37.278572
0
14.766
false
false
false
false
3.871069
0.482754
48.275354
0.659079
50.995373
0.361782
36.178248
0.390101
18.680089
0.483385
20.489844
0.541473
49.052527
false
false
2024-12-19
2024-12-19
1
mergekit-community/mergekit-della-zgowfmf (Merge)
mergekit-community_mergekit-model_stock-azgztvm_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/mergekit-model_stock-azgztvm" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/mergekit-model_stock-azgztvm</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__mergekit-model_stock-azgztvm-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/mergekit-model_stock-azgztvm
7f63ea96f89147daf909251cd3c1f1a20e005559
38.374258
0
14.766
false
false
false
false
4.067798
0.506159
50.615921
0.654278
50.294377
0.437311
43.731118
0.381711
17.561521
0.473
19.091667
0.540559
48.950946
false
false
2024-12-20
2024-12-20
1
mergekit-community/mergekit-model_stock-azgztvm (Merge)
mergekit-community_mergekit-slerp-fmrazcr_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/mergekit-slerp-fmrazcr" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/mergekit-slerp-fmrazcr</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__mergekit-slerp-fmrazcr-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/mergekit-slerp-fmrazcr
87305622616c521e66fd48c48fc9b6eeb6287ff8
22.69749
0
8.03
false
false
false
false
1.460969
0.417432
41.743241
0.534162
33.650929
0.119335
11.933535
0.311242
8.165548
0.410458
9.840625
0.37766
30.851064
false
false
2024-12-26
2024-12-26
1
mergekit-community/mergekit-slerp-fmrazcr (Merge)
mergekit-community_mergekit-ties-rraxdhv_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/mergekit-ties-rraxdhv" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/mergekit-ties-rraxdhv</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__mergekit-ties-rraxdhv-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/mergekit-ties-rraxdhv
27862d9e4e53426ab3274b316b5af3381c562e6d
16.323049
0
9.242
false
false
false
true
3.899639
0.112308
11.230757
0.518359
31.666385
0.04003
4.003021
0.307886
7.718121
0.420198
10.991406
0.390957
32.328605
false
false
2025-01-26
2025-01-26
1
mergekit-community/mergekit-ties-rraxdhv (Merge)
mergekit-community_mergekit-ties-ykqemwr_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/mergekit-ties-ykqemwr" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/mergekit-ties-ykqemwr</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__mergekit-ties-ykqemwr-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/mergekit-ties-ykqemwr
81ba78a711ee017c3174f0b3cbf2135ec5b45d3d
22.441351
0
12.248
false
false
false
false
2.067852
0.359955
35.995492
0.54555
34.709886
0.122356
12.23565
0.322148
9.619687
0.419792
11.707292
0.373421
30.380098
false
false
2024-12-25
2024-12-25
1
mergekit-community/mergekit-ties-ykqemwr (Merge)
mergekit-community_sexeh_time_testing_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mergekit-community/sexeh_time_testing" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mergekit-community/sexeh_time_testing</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mergekit-community__sexeh_time_testing-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mergekit-community/sexeh_time_testing
78a3c6eb8ee3d92f3c3669d91e3869ed5ca20a5c
25.631345
1
8.03
false
false
false
true
1.085065
0.732946
73.294636
0.524132
32.487494
0.089879
8.987915
0.291107
5.480984
0.361906
3.904948
0.366689
29.632092
false
false
2024-12-27
2024-12-27
1
mergekit-community/sexeh_time_testing (Merge)
meta-llama_Llama-2-13b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-13b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-13b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-13b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-13b-chat-hf
a2cb7a712bb6e5e736ca7f8cd98167f81a0b5bd8
11.129635
llama2
1,067
13.016
true
false
false
true
1.749139
0.398473
39.847272
0.334274
7.15538
0.013595
1.359517
0.231544
0
0.400729
8.157813
0.19232
10.257831
false
true
2023-07-13
2024-06-12
0
meta-llama/Llama-2-13b-chat-hf
meta-llama_Llama-2-13b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-13b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-13b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-13b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-13b-hf
5c31dfb671ce7cfe2d7bb7c04375e44c55e815b1
11.065186
llama2
594
13.016
true
false
false
false
2.22476
0.248247
24.824687
0.412562
17.22256
0.015106
1.510574
0.28104
4.138702
0.35375
3.385417
0.237783
15.309176
false
true
2023-07-13
2024-06-12
0
meta-llama/Llama-2-13b-hf
meta-llama_Llama-2-70b-chat-hf_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-70b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-70b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-70b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-70b-chat-hf
e9149a12809580e8602995856f8098ce973d1080
13.073696
llama2
2,182
68.977
true
false
false
true
45.79691
0.495792
49.579228
0.304247
4.613767
0.029456
2.945619
0.264262
1.901566
0.368667
3.483333
0.243268
15.918661
false
true
2023-07-14
2024-06-12
0
meta-llama/Llama-2-70b-chat-hf
meta-llama_Llama-2-70b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-70b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-70b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-70b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-70b-hf
3aba440b59558f995867ba6e1f58f21d0336b5bb
18.372599
llama2
846
68.977
true
false
false
false
59.242493
0.240678
24.067807
0.547259
35.900062
0.032477
3.247734
0.302852
7.04698
0.412354
9.777604
0.371759
30.195405
false
true
2023-07-11
2024-06-12
0
meta-llama/Llama-2-70b-hf
meta-llama_Llama-2-7b-chat-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-7b-chat-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-7b-chat-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-7b-chat-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-7b-chat-hf
f5db02db724555f92da89c216ac04704f23d4590
9.609483
llama2
4,318
6.738
true
false
false
true
1.791393
0.398648
39.864781
0.311355
4.459172
0.019637
1.963746
0.253356
0.447427
0.367552
3.277344
0.1688
7.64443
false
true
2023-07-13
2024-08-30
0
meta-llama/Llama-2-7b-chat-hf
meta-llama_Llama-2-7b-hf_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-2-7b-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-2-7b-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-2-7b-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-2-7b-hf
01c7f73d771dfac7d292323805ebc428287df4f9
8.806358
llama2
1,985
6.738
true
false
false
false
1.126189
0.251894
25.189386
0.34962
10.351417
0.017372
1.73716
0.266779
2.237136
0.370062
3.757813
0.186087
9.565233
false
true
2023-07-13
2024-06-12
0
meta-llama/Llama-2-7b-hf
meta-llama_Llama-3.1-70B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.1-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.1-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.1-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.1-70B
f7d3cc45ed4ff669a354baf2e0f05e65799a0bee
26.200216
llama3.1
350
70.554
true
false
false
true
13.601852
0.168438
16.843752
0.626007
46.399413
0.18429
18.429003
0.387584
18.344519
0.457188
16.581771
0.465426
40.602837
false
true
2024-07-14
2024-07-23
0
meta-llama/Llama-3.1-70B
meta-llama_Llama-3.1-70B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.1-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.1-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.1-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.1-70B-Instruct
b9461463b511ed3c0762467538ea32cf7c9669f2
43.409948
llama3.1
797
70.554
true
false
false
true
40.221824
0.866885
86.688542
0.691729
55.927992
0.380665
38.066465
0.356544
14.205817
0.458063
17.691146
0.530918
47.879728
false
true
2024-07-16
2024-08-15
1
meta-llama/Meta-Llama-3.1-70B
meta-llama_Llama-3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.1-8B
d04e592bb4f6aa9cfee91e2e20afa771667e1d4b
14.420865
llama3.1
1,503
8.03
true
false
false
false
1.426487
0.124598
12.459829
0.465959
25.304471
0.06571
6.570997
0.310403
8.053691
0.381188
8.715104
0.32879
25.421099
false
true
2024-07-14
2024-12-07
0
meta-llama/Llama-3.1-8B
meta-llama_Llama-3.1-8B-Instruct_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.1-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.1-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.1-8B-Instruct
0e9e39f249a16976918f6564b8830bc894c89659
23.763729
llama3.1
3,759
8.03
true
false
false
false
2.106037
0.492171
49.217077
0.508703
29.379192
0.155589
15.558912
0.315436
8.724832
0.397156
8.611198
0.37982
31.091164
false
true
2024-07-18
2025-02-06
1
meta-llama/Meta-Llama-3.1-8B
meta-llama_Llama-3.2-1B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-1B
a7c18587d7f473bfea02aa5639aa349403307b54
4.19514
llama3.2
1,700
1.24
true
false
false
false
0.838257
0.147779
14.7779
0.311495
4.36603
0.012085
1.208459
0.228188
0
0.344729
2.557813
0.120346
2.260638
false
true
2024-09-18
2024-09-23
0
meta-llama/Llama-3.2-1B
meta-llama_Llama-3.2-1B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-1B-Instruct
d0a2081ed47e20ce524e8bc5d132f3fad2f69ff0
14.443126
llama3.2
831
1.24
true
false
false
true
0.809809
0.569831
56.983138
0.349685
8.742521
0.070242
7.024169
0.275168
3.355705
0.332854
2.973438
0.168218
7.579787
false
true
2024-09-18
2024-09-23
0
meta-llama/Llama-3.2-1B-Instruct
meta-llama_Llama-3.2-3B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-3B
95c102307f55fbd6d18ddf28bfbcb537ffdc2806
8.697823
llama3.2
529
3.213
true
false
false
false
2.013735
0.133741
13.37407
0.390512
14.232665
0.018882
1.888218
0.267617
2.348993
0.357719
3.814844
0.248753
16.528147
false
true
2024-09-18
2024-09-27
0
meta-llama/Llama-3.2-3B
meta-llama_Llama-3.2-3B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.2-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.2-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.2-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.2-3B-Instruct
276b29ce8303c9b88966a9b32fc75692dce4d8e1
24.204651
llama3.2
1,245
3.213
true
false
false
true
1.927962
0.739316
73.931613
0.461007
24.059186
0.176737
17.673716
0.278523
3.803132
0.352854
1.373437
0.319481
24.38682
false
true
2024-09-18
2024-09-27
0
meta-llama/Llama-3.2-3B-Instruct
meta-llama_Llama-3.3-70B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Llama-3.3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Llama-3.3-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Llama-3.3-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Llama-3.3-70B-Instruct
44.847471
llama3.3
2,169
70.554
true
false
false
true
76.559074
0.899758
89.97582
0.691931
56.561411
0.483384
48.338369
0.328859
10.514541
0.446125
15.565625
0.533162
48.129063
false
true
2024-11-26
2024-12-03
1
meta-llama/Llama-3.3-70B-Instruct (Merge)
meta-llama_Meta-Llama-3-70B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-70B
b4d08b7db49d488da3ac49adf25a6b9ac01ae338
26.70535
llama3
854
70.554
true
false
false
false
46.814372
0.160319
16.031906
0.646107
48.709813
0.185801
18.58006
0.397651
19.686801
0.451823
16.011198
0.470911
41.212323
false
true
2024-04-17
2024-06-12
0
meta-llama/Meta-Llama-3-70B
meta-llama_Meta-Llama-3-70B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-70B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-70B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-70B-Instruct
7129260dd854a80eb10ace5f61c20324b472b31c
36.372224
llama3
1,464
70.554
true
false
false
true
36.4783
0.809908
80.990771
0.65467
50.185133
0.244713
24.471299
0.286913
4.9217
0.415365
10.920573
0.520695
46.743868
false
true
2024-04-17
2024-06-12
1
meta-llama/Meta-Llama-3-70B
meta-llama_Meta-Llama-3-8B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-8B
62bd457b6fe961a42a631306577e622c83876cb6
13.626857
llama3
6,093
8.03
true
false
false
false
1.745137
0.145506
14.550615
0.459791
24.500764
0.045317
4.531722
0.305369
7.38255
0.361406
6.242448
0.320977
24.553044
false
true
2024-04-17
2024-06-12
0
meta-llama/Meta-Llama-3-8B
meta-llama_Meta-Llama-3-8B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-8B-Instruct
e1945c40cd546c78e41f1151f4db032b271faeaa
23.908736
llama3
3,873
8.03
true
false
false
true
0.7975
0.74084
74.083986
0.498871
28.24495
0.086858
8.685801
0.259228
1.230425
0.356823
1.602865
0.366439
29.604388
false
true
2024-04-17
2024-06-12
0
meta-llama/Meta-Llama-3-8B-Instruct
meta-llama_Meta-Llama-3-8B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">meta-llama/Meta-Llama-3-8B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/meta-llama__Meta-Llama-3-8B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
meta-llama/Meta-Llama-3-8B-Instruct
e1945c40cd546c78e41f1151f4db032b271faeaa
20.609159
llama3
3,873
8.03
true
false
false
false
1.898947
0.478232
47.82322
0.491026
26.795284
0.09139
9.138973
0.292785
5.704698
0.380542
5.401042
0.359126
28.791741
false
true
2024-04-17
2024-07-08
0
meta-llama/Meta-Llama-3-8B-Instruct
mhl1_Qwen2.5-0.5B-cinstruct-stage1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/mhl1/Qwen2.5-0.5B-cinstruct-stage1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mhl1/Qwen2.5-0.5B-cinstruct-stage1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mhl1__Qwen2.5-0.5B-cinstruct-stage1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mhl1/Qwen2.5-0.5B-cinstruct-stage1
19d55d8d5bf1e7d98a865121862f3781a27b1b2e
4.551665
apache-2.0
0
0.63
true
false
false
true
1.764801
0.148179
14.817905
0.325578
5.724527
0.01284
1.283988
0.265101
2.013423
0.350031
1.920573
0.113946
1.549572
false
false
2025-01-08
2025-01-08
1
mhl1/Qwen2.5-0.5B-cinstruct-stage1 (Merge)
microsoft_DialoGPT-medium_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GPT2LMHeadModel
<a target="_blank" href="https://huggingface.co/microsoft/DialoGPT-medium" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/DialoGPT-medium</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__DialoGPT-medium-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/DialoGPT-medium
7b40bb0f92c45fefa957d088000d8648e5c7fa33
5.251434
mit
360
0.345
true
false
false
true
0.258929
0.147904
14.790423
0.301416
2.556856
0
0
0.254195
0.559284
0.428667
12.283333
0.111868
1.318706
false
true
2022-03-02
2024-06-13
0
microsoft/DialoGPT-medium
microsoft_Orca-2-13b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Orca-2-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Orca-2-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Orca-2-13b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Orca-2-13b
2539ff53e6baa4cc603774ad5a2d646f4041ea4e
18.501871
other
666
13
true
false
false
false
2.017163
0.312793
31.279339
0.488449
27.308019
0.031722
3.172205
0.280201
4.026846
0.512969
25.78776
0.274934
19.437057
false
true
2023-11-14
2024-06-12
0
microsoft/Orca-2-13b
microsoft_Orca-2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Orca-2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Orca-2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Orca-2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Orca-2-7b
60e31e6bdcf582ad103b807cb74b73ee1d2c4b17
14.40483
other
218
7
true
false
false
false
1.81067
0.218346
21.834621
0.445213
22.429468
0.019637
1.963746
0.260906
1.454139
0.502615
24.09349
0.231882
14.653517
false
true
2023-11-14
2024-06-12
0
microsoft/Orca-2-7b
microsoft_Phi-3-medium-128k-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-medium-128k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3-medium-128k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-medium-128k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3-medium-128k-instruct
fa7d2aa4f5ea69b2e36b20d050cdae79c9bfbb3f
32.026356
mit
381
13.96
true
false
false
true
3.895117
0.604003
60.400293
0.638232
48.460451
0.191843
19.18429
0.336409
11.521253
0.412948
11.351823
0.47116
41.240027
false
true
2024-05-07
2024-08-21
0
microsoft/Phi-3-medium-128k-instruct
microsoft_Phi-3-medium-4k-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-medium-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3-medium-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-medium-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3-medium-4k-instruct
d194e4e74ffad5a5e193e26af25bcfc80c7f1ffc
33.097659
mit
217
13.96
true
false
false
true
2.910525
0.642271
64.22714
0.641246
49.38061
0.195619
19.561934
0.336409
11.521253
0.42575
13.052083
0.467586
40.842937
false
true
2024-05-07
2024-06-12
0
microsoft/Phi-3-medium-4k-instruct
microsoft_Phi-3-mini-128k-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-mini-128k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3-mini-128k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-mini-128k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3-mini-128k-instruct
5be6479b4bc06a081e8f4c6ece294241ccd32dec
26.34381
mit
1,636
3.821
true
false
false
true
48.444503
0.597633
59.763317
0.557453
37.099767
0.140483
14.048338
0.317953
9.060403
0.393688
7.710938
0.373421
30.380098
false
true
2024-04-22
2024-08-21
0
microsoft/Phi-3-mini-128k-instruct
microsoft_Phi-3-mini-4k-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-mini-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3-mini-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-mini-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3-mini-4k-instruct
ff07dc01615f8113924aed013115ab2abd32115b
25.967733
mit
1,154
3.821
true
false
false
true
0.804075
0.561288
56.128849
0.567597
39.269335
0.116314
11.63142
0.319631
9.284116
0.395021
7.644271
0.386636
31.848404
false
true
2024-04-22
2024-06-12
0
microsoft/Phi-3-mini-4k-instruct
microsoft_Phi-3-mini-4k-instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-mini-4k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3-mini-4k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-mini-4k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3-mini-4k-instruct
c1358f8a35e6d2af81890deffbbfa575b978c62f
27.562174
mit
1,154
3.821
true
false
false
true
1.573399
0.547675
54.767461
0.549072
36.559855
0.163897
16.389728
0.332215
10.961969
0.428417
13.11875
0.402178
33.575281
false
true
2024-04-22
2024-07-02
0
microsoft/Phi-3-mini-4k-instruct
microsoft_Phi-3-small-128k-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3SmallForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-small-128k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3-small-128k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-small-128k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3-small-128k-instruct
f80aaa30bfc64c2b8ab214b541d9050e97163bc4
31.967803
mit
175
7.392
true
false
false
true
3.898024
0.636826
63.682584
0.620218
45.63407
0.202609
20.26087
0.317114
8.948546
0.437844
14.497135
0.449053
38.783614
false
true
2024-05-07
2024-06-13
0
microsoft/Phi-3-small-128k-instruct
microsoft_Phi-3-small-8k-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3SmallForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3-small-8k-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3-small-8k-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3-small-8k-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3-small-8k-instruct
1535ae26fb4faada95c6950e8bc6e867cdad6b00
32.342014
mit
165
7.392
true
false
false
true
2.050907
0.649665
64.966511
0.620836
46.20557
0.188696
18.869565
0.312081
8.277405
0.455792
16.773958
0.450632
38.959072
false
true
2024-05-07
2024-06-13
0
microsoft/Phi-3-small-8k-instruct
microsoft_Phi-3.5-MoE-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3.5-MoE-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3.5-MoE-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3.5-MoE-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3.5-MoE-instruct
482a9ba0eb0e1fa1671e3560e009d7cec2e5147c
36.878965
mit
556
42
true
true
false
true
9.264557
0.692455
69.245491
0.640763
48.774646
0.311934
31.193353
0.355705
14.09396
0.456479
17.326562
0.465758
40.639775
false
true
2024-08-17
2024-08-21
0
microsoft/Phi-3.5-MoE-instruct
microsoft_Phi-3.5-mini-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-3.5-mini-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-3.5-mini-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-3.5-mini-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-3.5-mini-instruct
64963004ad95869fa73a30279371c8778509ac84
28.184391
mit
844
3.821
true
false
false
true
7.392009
0.57745
57.745005
0.551779
36.745854
0.196375
19.637462
0.339765
11.96868
0.402125
10.098958
0.396193
32.910387
false
true
2024-08-16
2024-08-21
0
microsoft/Phi-3.5-mini-instruct
microsoft_Phi-4-mini-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/Phi-4-mini-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/Phi-4-mini-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__Phi-4-mini-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/Phi-4-mini-instruct
f984c153f9e5738b59f28190d593bd9ad40745bb
29.412434
mit
375
3.836
true
false
false
true
0.82824
0.737792
73.779239
0.568863
38.735536
0.16994
16.993958
0.309564
7.941834
0.387302
6.446094
0.393201
32.57794
false
true
2025-02-19
2025-02-28
0
microsoft/Phi-4-mini-instruct
microsoft_phi-1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/phi-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/phi-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/phi-1
b9ac0e6d78d43970ecf88e9e0154b3a7da20ed89
5.574318
mit
211
1.418
true
false
false
false
0.572458
0.206806
20.680572
0.313948
4.273999
0.009819
0.981873
0.265101
2.013423
0.35251
3.697135
0.11619
1.798907
false
true
2023-09-10
2024-06-13
0
microsoft/phi-1
microsoft_phi-1_5_float16
float16
🟢 pretrained
🟢
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/phi-1_5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/phi-1_5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-1_5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/phi-1_5
675aa382d814580b22651a30acb1a585d7c25963
7.170967
mit
1,328
1.418
true
false
false
false
0.681724
0.203284
20.328395
0.335976
7.468939
0.018127
1.812689
0.267617
2.348993
0.340417
3.385417
0.169132
7.681368
false
true
2023-09-10
2024-06-09
0
microsoft/phi-1_5
microsoft_phi-2_float16
float16
🟢 pretrained
🟢
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/phi-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/phi-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/phi-2
ef382358ec9e382308935a992d908de099b64c23
15.534292
mit
3,293
2.78
true
false
false
false
0.847042
0.273876
27.387554
0.488121
28.038519
0.029456
2.945619
0.271812
2.908277
0.409896
13.836979
0.262799
18.0888
false
true
2023-12-13
2024-06-09
0
microsoft/phi-2
microsoft_phi-4_float16
float16
🟢 pretrained
🟢
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/phi-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/phi-4
381727a5ee103da6c1b14ecd3d39cd09832cbcf8
29.483417
mit
1,919
14.66
true
false
false
false
0.878362
0.048785
4.8785
0.670346
52.575672
0.278701
27.870091
0.401007
20.134228
0.503354
23.719271
0.529505
47.722739
false
true
2024-12-11
2025-01-08
0
microsoft/phi-4
microsoft_phi-4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/microsoft/phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">microsoft/phi-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/microsoft__phi-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
microsoft/phi-4
381727a5ee103da6c1b14ecd3d39cd09832cbcf8
30.358128
mit
1,919
14.66
true
false
false
true
2.77316
0.058527
5.852693
0.669056
52.427848
0.316465
31.646526
0.40604
20.805369
0.503354
23.785938
0.528674
47.630393
false
true
2024-12-11
2025-01-08
0
microsoft/phi-4
migtissera_Llama-3-70B-Synthia-v3.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Llama-3-70B-Synthia-v3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Llama-3-70B-Synthia-v3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Llama-3-70B-Synthia-v3.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Llama-3-70B-Synthia-v3.5
8744db0bccfc18f1847633da9d29fc89b35b4190
35.569354
llama3
5
70.554
true
false
false
true
17.539396
0.60765
60.764992
0.648864
49.11816
0.21148
21.148036
0.387584
18.344519
0.492198
23.391406
0.465841
40.64901
false
false
2024-05-26
2024-08-28
0
migtissera/Llama-3-70B-Synthia-v3.5
migtissera_Llama-3-8B-Synthia-v3.5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Llama-3-8B-Synthia-v3.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Llama-3-8B-Synthia-v3.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Llama-3-8B-Synthia-v3.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Llama-3-8B-Synthia-v3.5
af4990801a24fee7acf16370cb5aa5643b5e9d6c
19.94844
llama3
15
8.03
true
false
false
true
1.657397
0.506958
50.69582
0.488794
27.542339
0.06571
6.570997
0.271812
2.908277
0.404385
9.414844
0.303025
22.558363
false
false
2024-05-17
2024-08-28
0
migtissera/Llama-3-8B-Synthia-v3.5
migtissera_Tess-3-7B-SFT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-3-7B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-3-7B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-3-7B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-3-7B-SFT
404de3b56564dbd43cd64d97f8574b43189462f3
17.209456
apache-2.0
4
7.248
true
false
false
true
1.29434
0.394626
39.462626
0.460735
24.123847
0.04003
4.003021
0.270973
2.796421
0.411271
10.275521
0.303358
22.595301
false
false
2024-07-09
2024-07-20
1
mistralai/Mistral-7B-v0.3
migtissera_Tess-3-Mistral-Nemo-12B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-3-Mistral-Nemo-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-3-Mistral-Nemo-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-3-Mistral-Nemo-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-3-Mistral-Nemo-12B
0b82dea6e8f4aed4a1c2e10198d68991c30d171b
16.720173
apache-2.0
12
12.248
true
false
false
true
3.779984
0.3355
33.549981
0.489942
28.042728
0.057402
5.740181
0.250839
0.111857
0.445781
15.489323
0.256483
17.386968
false
false
2024-08-13
2024-09-16
0
migtissera/Tess-3-Mistral-Nemo-12B
migtissera_Tess-v2.5-Phi-3-medium-128k-14B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-v2.5-Phi-3-medium-128k-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-v2.5-Phi-3-medium-128k-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-v2.5-Phi-3-medium-128k-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-v2.5-Phi-3-medium-128k-14B
3a4dbce32e765f659d418c57f0040d290b8b480d
24.141201
mit
4
13.96
true
false
false
true
4.47634
0.453877
45.387682
0.620661
46.215828
0.050604
5.060423
0.307886
7.718121
0.411302
10.11276
0.373172
30.352394
false
false
2024-06-05
2024-08-30
1
microsoft/Phi-3-medium-128k-instruct
migtissera_Tess-v2.5.2-Qwen2-72B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Tess-v2.5.2-Qwen2-72B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Tess-v2.5.2-Qwen2-72B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Tess-v2.5.2-Qwen2-72B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Tess-v2.5.2-Qwen2-72B
0435e634ad9bc8b1172395a535b78e6f25f3594f
33.603338
other
11
72
true
false
false
true
29.226175
0.449431
44.943084
0.664679
52.308136
0.293807
29.380665
0.350671
13.422819
0.418833
10.8875
0.5561
50.677822
false
false
2024-06-13
2024-08-10
0
migtissera/Tess-v2.5.2-Qwen2-72B
migtissera_Trinity-2-Codestral-22B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Trinity-2-Codestral-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Trinity-2-Codestral-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Trinity-2-Codestral-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Trinity-2-Codestral-22B
5f20b9d8af1a75c135c70bd7295e58301cce63fc
21.995244
other
12
22.247
true
false
false
true
3.004315
0.420205
42.020506
0.559324
36.412738
0.096677
9.667674
0.314597
8.612975
0.411052
9.614844
0.330785
25.64273
false
false
2024-08-07
2024-09-16
1
mistralai/Codestral-22B-v0.1
migtissera_Trinity-2-Codestral-22B-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Trinity-2-Codestral-22B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Trinity-2-Codestral-22B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Trinity-2-Codestral-22B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Trinity-2-Codestral-22B-v0.2
63513c3eb9b7c552fc163f58a2e7dc1fa09573b5
21.869825
other
7
22.247
true
false
false
true
1.553522
0.434468
43.446832
0.568636
37.614246
0.083837
8.383686
0.300336
6.711409
0.404479
9.059896
0.334026
26.002881
false
false
2024-08-13
2024-08-28
1
mistralai/Codestral-22B-v0.1
migtissera_Trinity-2-Codestral-22B-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/migtissera/Trinity-2-Codestral-22B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">migtissera/Trinity-2-Codestral-22B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/migtissera__Trinity-2-Codestral-22B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
migtissera/Trinity-2-Codestral-22B-v0.2
9452a82ac7bfa9092a061ec913e9078ef3525a03
22.250269
other
7
22.247
true
false
false
true
3.122415
0.443011
44.301121
0.570647
37.786041
0.086858
8.685801
0.307886
7.718121
0.403146
8.859896
0.335356
26.150635
false
false
2024-08-13
2024-09-16
1
mistralai/Codestral-22B-v0.1
mindw96_DeepSeek-llama3.3-Bllossom-8B-DACON-LLM3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mindw96/DeepSeek-llama3.3-Bllossom-8B-DACON-LLM3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mindw96/DeepSeek-llama3.3-Bllossom-8B-DACON-LLM3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mindw96__DeepSeek-llama3.3-Bllossom-8B-DACON-LLM3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mindw96/DeepSeek-llama3.3-Bllossom-8B-DACON-LLM3
03cffd19ee78646543e020e2ebc9d553a4c5242b
4.009061
0
8.03
false
false
false
false
1.553127
0.138812
13.881169
0.306754
3.315965
0.008308
0.830816
0.250839
0.111857
0.379208
4.734375
0.110622
1.180186
false
false
2025-02-17
0
Removed
minghaowu_Qwen1.5-1.8B-OpenHermes-2.5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/minghaowu/Qwen1.5-1.8B-OpenHermes-2.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">minghaowu/Qwen1.5-1.8B-OpenHermes-2.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/minghaowu__Qwen1.5-1.8B-OpenHermes-2.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
minghaowu/Qwen1.5-1.8B-OpenHermes-2.5
40700de82968350c192318877fe522630d0ef76d
8.684751
0
1.837
false
false
false
true
2.189801
0.277797
27.779736
0.337464
7.561478
0.024169
2.416918
0.283557
4.474273
0.352885
1.077344
0.179189
8.798759
false
false
2024-09-12
0
Removed
ministral_Ministral-3b-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ministral/Ministral-3b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ministral/Ministral-3b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ministral__Ministral-3b-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ministral/Ministral-3b-instruct
2c95908929198d6e69af8638f0dbbd9bc6b93f9e
3.520083
apache-2.0
48
3.316
true
false
false
false
0.528974
0.135764
13.576422
0.319186
4.675864
0.008308
0.830816
0.251678
0.223714
0.33825
0.78125
0.109292
1.032432
false
false
2024-03-14
2024-10-25
0
ministral/Ministral-3b-instruct
mistral-community_Mistral-7B-v0.2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistral-community/Mistral-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistral-community/Mistral-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistral-community__Mistral-7B-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistral-community/Mistral-7B-v0.2
2c3e624962b1a3f3fbf52e15969565caa7bc064a
14.215362
apache-2.0
232
7.242
true
false
false
false
1.106427
0.22664
22.663976
0.451019
23.950865
0.030211
3.021148
0.291946
5.592841
0.403177
8.363802
0.295296
21.699542
false
true
2024-03-23
2024-06-12
0
mistral-community/Mistral-7B-v0.2
mistral-community_Mixtral-8x22B-v0.1_float16
float16
❓ other
❓
Original
Unknown
<a target="_blank" href="https://huggingface.co/mistral-community/Mixtral-8x22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistral-community/Mixtral-8x22B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistral-community__Mixtral-8x22B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistral-community/Mixtral-8x22B-v0.1
ab1e8c1950cf359e2a25de9b274ab836adb6dbab
16.82739
apache-2.0
674
0
true
true
false
false
15.173202
0.316656
31.665644
0.38
12.647903
0.154286
15.428571
0.33
10.666667
0.353333
1.666667
0.36
28.888889
false
true
2024-04-10
0
mistral-community/Mixtral-8x22B-v0.1
mistral-community_mixtral-8x22B-v0.3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mistral-community/mixtral-8x22B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistral-community/mixtral-8x22B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistral-community__mixtral-8x22B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistral-community/mixtral-8x22B-v0.3
211b177b79ab5ef245ee334d106c27623e786882
25.801995
apache-2.0
3
140.63
true
true
false
false
104.98897
0.258264
25.826363
0.625
45.731041
0.183535
18.353474
0.377517
17.002237
0.403698
7.46224
0.46393
40.436613
false
true
2024-05-25
2024-06-13
0
mistral-community/mixtral-8x22B-v0.3
mistralai_Codestral-22B-v0.1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Codestral-22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Codestral-22B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Codestral-22B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Codestral-22B-v0.1
8f5fe23af91885222a1563283c87416745a5e212
23.279917
other
1,236
22.247
true
false
false
true
2.613339
0.577175
57.717523
0.513914
30.737634
0.100453
10.045317
0.298658
6.487696
0.418708
10.738542
0.315575
23.952793
false
true
2024-05-29
2024-09-28
0
mistralai/Codestral-22B-v0.1
mistralai_Ministral-8B-Instruct-2410_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Ministral-8B-Instruct-2410" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Ministral-8B-Instruct-2410</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Ministral-8B-Instruct-2410-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Ministral-8B-Instruct-2410
199e57c1d66379760f6413f79d27008d1d1dbd6e
24.185603
other
447
8.02
true
false
false
true
1.594173
0.58964
58.963993
0.476164
25.824774
0.195619
19.561934
0.284396
4.58613
0.41375
10.71875
0.329122
25.458038
false
true
2024-10-15
2024-12-01
0
mistralai/Ministral-8B-Instruct-2410
mistralai_Mistral-7B-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-7B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-7B-Instruct-v0.1
73068f3702d050a2fd5aa2ca1e612e5036429398
12.771229
apache-2.0
1,601
7.242
true
false
false
true
1.862914
0.448706
44.87061
0.335481
7.647021
0.022659
2.265861
0.25
0
0.38476
6.128385
0.241439
15.715499
false
true
2023-09-27
2024-06-27
1
mistralai/Mistral-7B-v0.1
mistralai_Mistral-7B-Instruct-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-7B-Instruct-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-Instruct-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-7B-Instruct-v0.2
41b61a33a2483885c981aa79e0df6b32407ed873
18.507892
apache-2.0
2,688
7.242
true
false
false
true
1.068813
0.549623
54.962278
0.445974
22.910602
0.030211
3.021148
0.276007
3.467562
0.396604
7.608854
0.271692
19.076906
false
true
2023-12-11
2024-06-12
0
mistralai/Mistral-7B-Instruct-v0.2
mistralai_Mistral-7B-Instruct-v0.3_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-7B-Instruct-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-Instruct-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-7B-Instruct-v0.3
83e9aa141f2e28c82232fea5325f54edf17c43de
19.225099
apache-2.0
1,515
7.248
true
false
false
true
1.075567
0.546525
54.652544
0.472196
25.569115
0.03852
3.851964
0.279362
3.914989
0.373906
4.304948
0.307513
23.057033
false
true
2024-05-22
2024-06-12
1
mistralai/Mistral-7B-v0.3
mistralai_Mistral-7B-v0.1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-7B-v0.1
26bca36bde8333b5d7f72e9ed20ccda6a618af24
14.575359
apache-2.0
3,655
7.242
true
false
false
false
0.778074
0.238555
23.855481
0.44194
22.018255
0.029456
2.945619
0.291946
5.592841
0.413938
10.675521
0.30128
22.364436
false
true
2023-09-20
2024-06-12
0
mistralai/Mistral-7B-v0.1
mistralai_Mistral-7B-v0.3_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-7B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-7B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-7B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-7B-v0.3
b67d6a03ca097c5122fa65904fce0413500bf8c8
14.229761
apache-2.0
450
7.248
true
false
false
false
0.762909
0.22664
22.663976
0.451685
24.037254
0.030211
3.021148
0.291946
5.592841
0.403177
8.363802
0.295296
21.699542
false
true
2024-05-22
2024-06-12
0
mistralai/Mistral-7B-v0.3
mistralai_Mistral-Large-Instruct-2411_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Large-Instruct-2411" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Large-Instruct-2411</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Large-Instruct-2411-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-Large-Instruct-2411
3a5cb136f6106edf5c1210369068eb5a4f787cab
46.524214
other
211
122.61
true
false
false
true
52.54461
0.840058
84.005771
0.674665
52.744892
0.495468
49.546828
0.437081
24.944072
0.454
17.216667
0.556184
50.687057
false
true
2024-11-14
2024-11-19
0
mistralai/Mistral-Large-Instruct-2411
mistralai_Mistral-Nemo-Base-2407_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Nemo-Base-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Nemo-Base-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Nemo-Base-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-Nemo-Base-2407
d2efb15544d5401f761235bef327babb850887d0
15.239356
apache-2.0
299
11.58
true
false
false
false
3.405991
0.162992
16.299197
0.503506
29.374736
0.059668
5.966767
0.293624
5.816555
0.392135
6.516927
0.347158
27.461953
false
true
2024-07-18
2024-07-19
0
mistralai/Mistral-Nemo-Base-2407
mistralai_Mistral-Nemo-Instruct-2407_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Nemo-Instruct-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Nemo-Instruct-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-Nemo-Instruct-2407
4d14c1db68fe20dbf80b8eca85d39b909c5fe1d5
24.6656
apache-2.0
1,495
12.248
true
false
false
true
4.468048
0.638025
63.802489
0.503652
29.67997
0.126888
12.688822
0.290268
5.369128
0.39
8.483333
0.351729
27.969858
false
true
2024-07-17
2024-08-29
1
mistralai/Mistral-Nemo-Base-2407
mistralai_Mistral-Small-24B-Base-2501_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-24B-Base-2501" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-24B-Base-2501</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Small-24B-Base-2501-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-Small-24B-Base-2501
a8f19b61efa3c0a8d8a5f901ed48b30ff6b8c70e
27.19513
apache-2.0
231
23.572
true
false
false
false
4.275625
0.167238
16.723848
0.644186
48.537576
0.19713
19.712991
0.387584
18.344519
0.423667
10.891667
0.540642
48.96018
false
true
2025-01-23
2025-01-30
0
mistralai/Mistral-Small-24B-Base-2501
mistralai_Mistral-Small-Instruct-2409_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-Instruct-2409" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-Instruct-2409</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Small-Instruct-2409-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-Small-Instruct-2409
63e53df6575e7085d62113f4383835ff979b3795
26.262749
other
382
22.05
true
false
false
true
1.379338
0.666976
66.697585
0.521308
30.792096
0.143505
14.350453
0.323826
9.8434
0.363208
3.001042
0.396027
32.891918
false
true
2024-09-17
2024-09-19
0
mistralai/Mistral-Small-Instruct-2409
mistralai_Mistral-Small-Instruct-2409_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mistral-Small-Instruct-2409" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mistral-Small-Instruct-2409</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mistral-Small-Instruct-2409-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mistral-Small-Instruct-2409
63e53df6575e7085d62113f4383835ff979b3795
29.918948
other
382
22.247
true
false
false
false
3.220015
0.628283
62.828296
0.583028
40.559713
0.203927
20.392749
0.333054
11.073826
0.406333
10.225
0.409907
34.434102
false
true
2024-09-17
2024-09-25
0
mistralai/Mistral-Small-Instruct-2409
mistralai_Mixtral-8x22B-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x22B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x22B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mixtral-8x22B-Instruct-v0.1
b0c3516041d014f640267b14feb4e9a84c8e8c71
33.88568
apache-2.0
719
140.621
true
true
false
true
47.147579
0.718358
71.83584
0.612492
44.114346
0.187311
18.731118
0.373322
16.442953
0.431115
13.489323
0.448305
38.700502
false
true
2024-04-16
2024-06-12
1
mistralai/Mixtral-8x22B-v0.1
mistralai_Mixtral-8x22B-v0.1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x22B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x22B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x22B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mixtral-8x22B-v0.1
b03e260818710044a2f088d88fab12bb220884fb
25.740936
apache-2.0
215
140.621
true
true
false
false
157.565792
0.258264
25.826363
0.623981
45.588404
0.183535
18.353474
0.375839
16.778523
0.403698
7.46224
0.46393
40.436613
false
true
2024-04-16
2024-06-12
0
mistralai/Mixtral-8x22B-v0.1
mistralai_Mixtral-8x7B-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x7B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x7B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mixtral-8x7B-Instruct-v0.1
1e637f2d7cb0a9d6fb1922f305cb784995190a83
23.817103
apache-2.0
4,352
46.703
true
true
false
true
17.484987
0.559914
55.991436
0.496237
29.742398
0.09139
9.138973
0.302852
7.04698
0.420323
11.073698
0.369182
29.909131
false
true
2023-12-10
2024-06-12
1
mistralai/Mixtral-8x7B-v0.1
mistralai_Mixtral-8x7B-v0.1_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mixtral-8x7B-v0.1
985aa055896a8f943d4a9f2572e6ea1341823841
19.565281
apache-2.0
1,693
46.703
true
true
false
false
23.377819
0.241527
24.152693
0.508667
30.294195
0.101964
10.196375
0.313758
8.501119
0.432135
12.583594
0.384973
31.663712
false
true
2023-12-01
2024-08-20
0
mistralai/Mixtral-8x7B-v0.1
mistralai_Mixtral-8x7B-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mistralai/Mixtral-8x7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mistralai/Mixtral-8x7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mistralai__Mixtral-8x7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mistralai/Mixtral-8x7B-v0.1
985aa055896a8f943d4a9f2572e6ea1341823841
19.665109
apache-2.0
1,693
46.703
true
true
false
false
5.1351
0.232609
23.260948
0.509771
30.400299
0.093656
9.365559
0.32047
9.395973
0.441313
13.664063
0.387134
31.903812
false
true
2023-12-01
2024-06-27
0
mistralai/Mixtral-8x7B-v0.1
mixtao_MixTAO-7Bx2-MoE-v8.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mixtao/MixTAO-7Bx2-MoE-v8.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mixtao/MixTAO-7Bx2-MoE-v8.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mixtao__MixTAO-7Bx2-MoE-v8.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mixtao/MixTAO-7Bx2-MoE-v8.1
339130b87b6ef2484fea9fbfacba8a714ac03347
21.077927
apache-2.0
55
12.879
true
true
false
false
1.84807
0.416233
41.623337
0.518906
32.310342
0.090634
9.063444
0.284396
4.58613
0.446333
15.291667
0.312334
23.592642
false
false
2024-02-26
2024-10-04
0
mixtao/MixTAO-7Bx2-MoE-v8.1
mkurman_llama-3.2-MEDIT-3B-o1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mkurman/llama-3.2-MEDIT-3B-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mkurman/llama-3.2-MEDIT-3B-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mkurman__llama-3.2-MEDIT-3B-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mkurman/llama-3.2-MEDIT-3B-o1
b85c09ebdd588d98a0bc9daa52a78a4317d712db
17.028894
llama3.2
12
3.607
true
false
false
true
1.152686
0.438165
43.816518
0.439966
20.819346
0.130665
13.066465
0.26594
2.12528
0.356542
3.001042
0.274102
19.34471
false
false
2025-01-03
2025-01-07
1
mkurman/llama-3.2-MEDIT-3B-o1 (Merge)
mkurman_phi-4-MedIT-11B-exp-1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/mkurman/phi-4-MedIT-11B-exp-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mkurman/phi-4-MedIT-11B-exp-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mkurman__phi-4-MedIT-11B-exp-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mkurman/phi-4-MedIT-11B-exp-1
7ee6cc2dac29514784142da4d4d2bb4d77dc96dc
24.607235
0
11.514
false
false
false
true
1.599308
0.594761
59.476079
0.541394
34.737186
0.089879
8.987915
0.301174
6.823266
0.384792
6.232292
0.38248
31.386673
false
false
2025-01-10
0
Removed
mkurman_phi4-MedIT-10B-o1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaMedITForCausalLM
<a target="_blank" href="https://huggingface.co/mkurman/phi4-MedIT-10B-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mkurman/phi4-MedIT-10B-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mkurman__phi4-MedIT-10B-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mkurman/phi4-MedIT-10B-o1
2664ba0eb6272f27c6c8d88416ae6f9ace7ba01d
18.920738
mit
4
10.255
true
false
false
true
1.718869
0.346291
34.629117
0.51982
31.190281
0.114804
11.480363
0.245805
0
0.396792
8.365625
0.350731
27.859043
false
false
2025-01-17
2025-01-18
0
mkurman/phi4-MedIT-10B-o1
mkxu_llama-3-8b-instruct-fpo_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mkxu/llama-3-8b-instruct-fpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mkxu/llama-3-8b-instruct-fpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mkxu__llama-3-8b-instruct-fpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mkxu/llama-3-8b-instruct-fpo
984bd038d56d9aa15ecb853111d2dfd8054a337e
23.813703
0
8.03
false
false
false
true
0.813464
0.679016
67.901612
0.495911
28.867565
0.073263
7.326284
0.277685
3.691275
0.365781
6.15599
0.360455
28.939495
false
false
2025-02-16
2025-02-18
0
mkxu/llama-3-8b-instruct-fpo
mkxu_llama-3-8b-po1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/mkxu/llama-3-8b-po1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mkxu/llama-3-8b-po1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mkxu__llama-3-8b-po1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mkxu/llama-3-8b-po1
1b16e10de696c43cd2b49fac9f6195dc551438ee
19.767002
0
8.03
false
false
false
false
1.024376
0.408115
40.811491
0.497609
29.181759
0.070242
7.024169
0.29698
6.263982
0.380417
6.852083
0.356217
28.468528
false
false
2024-11-29
2024-11-29
0
mkxu/llama-3-8b-po1
mlabonne_AlphaMonarch-7B_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/mlabonne/AlphaMonarch-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/AlphaMonarch-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__AlphaMonarch-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mlabonne/AlphaMonarch-7B
3de065d84411d74e5b3590f67f52b0b71faf6161
17.630621
cc-by-nc-4.0
148
7.242
true
false
false
true
1.145143
0.493944
49.394385
0.462552
23.947378
0.040785
4.07855
0.270134
2.684564
0.412135
9.316927
0.247257
16.361924
true
true
2024-02-14
2024-06-12
1
mlabonne/AlphaMonarch-7B (Merge)
mlabonne_Beyonder-4x7B-v3_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/mlabonne/Beyonder-4x7B-v3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/Beyonder-4x7B-v3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__Beyonder-4x7B-v3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mlabonne/Beyonder-4x7B-v3
8e923fa480f511ab54d79b44b0487768bdd3de4e
19.406859
cc-by-nc-4.0
58
24.154
true
true
false
true
2.772605
0.560839
56.083857
0.467052
24.557209
0.053625
5.362538
0.285235
4.697987
0.404542
8.934375
0.251247
16.805186
true
true
2024-03-21
2024-06-12
1
mlabonne/Beyonder-4x7B-v3 (Merge)
mlabonne_BigQwen2.5-52B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/mlabonne/BigQwen2.5-52B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">mlabonne/BigQwen2.5-52B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/mlabonne__BigQwen2.5-52B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
mlabonne/BigQwen2.5-52B-Instruct
425b9bffc9871085cc0d42c34138ce776f96ba02
43.550005
apache-2.0
7
52.268
true
false
false
true
41.174805
0.791348
79.134807
0.7121
59.809607
0.547583
54.758308
0.302013
6.935123
0.411302
10.446094
0.551945
50.21609
true
true
2024-09-23
2024-09-25
1
mlabonne/BigQwen2.5-52B-Instruct (Merge)