Column schema (36 fields per record):

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | string | length 12 to 111 |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 2 values |
| Architecture | string | 64 values |
| Model | string | length 355 to 689 |
| fullname | string | length 4 to 102 |
| Model sha | string | length 0 to 40 |
| Average ⬆️ | float64 | 0.74 to 52.1 |
| Hub License | string | 27 values |
| Hub ❤️ | int64 | 0 to 6.09k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04 to 187 |
| IFEval Raw | float64 | 0 to 0.9 |
| IFEval | float64 | 0 to 90 |
| BBH Raw | float64 | 0.22 to 0.83 |
| BBH | float64 | 0.25 to 76.7 |
| MATH Lvl 5 Raw | float64 | 0 to 0.71 |
| MATH Lvl 5 | float64 | 0 to 71.5 |
| GPQA Raw | float64 | 0.21 to 0.47 |
| GPQA | float64 | 0 to 29.4 |
| MUSR Raw | float64 | 0.29 to 0.6 |
| MUSR | float64 | 0 to 38.7 |
| MMLU-PRO Raw | float64 | 0.1 to 0.73 |
| MMLU-PRO | float64 | 0 to 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 values |
| Submission Date | string | 263 values |
| Generation | int64 | 0 to 10 |
| Base Model | string | length 4 to 102 |
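Across the records below, the Average ⬆️ value appears to equal the arithmetic mean of the six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal check in Python against the first record's values, copied from this dump:

```python
# Normalized scores for Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8 (first record).
scores = {
    "IFEval": 78.747612,
    "BBH": 49.034138,
    "MATH Lvl 5": 55.589124,
    "GPQA": 11.409396,
    "MUSR": 15.18724,
    "MMLU-PRO": 46.734634,
}

# Mean of the six normalized scores; the record stores Average ⬆️ = 42.783691.
average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 42.783691
```

The Raw columns hold the unrescaled accuracies in [0, 1]; for IFEval the normalized score is simply raw × 100, while the other benchmarks appear to be rescaled (e.g. BBH raw 0.641947 maps to 49.034138), so the raw and normalized columns carry distinct information.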
Model metadata:

| fullname | Precision | Type | Weight type | Architecture | Model sha | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 351171919b27332a7686cdafd9fe8a380f1f055e | apache-2.0 | 2 | 14.766 | true | false | false | true | true | false | 2025-02-28 | 2025-03-09 | 1 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8 (Merge) |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.5 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 770fd3503a058619e186a59966a093c92a25475c |  | 0 | 14.766 | false | false | false | false | false | false | 2025-02-28 |  | 0 | Removed |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.6 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | bd6c996b6f7cd1f905e82064d7fc98612b2a5350 |  | 0 | 14.766 | false | false | false | false | false | false | 2025-03-01 |  | 0 | Removed |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.7 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 2ddb5cfe869eb6456ee9ea3cc19783db4ff7ab63 | apache-2.0 | 3 | 14.766 | true | false | false | true | true | false | 2025-03-01 | 2025-03-09 | 1 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.7 (Merge) |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.8 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 356ebbe36a74f620d78bbf5b2554c31131ba5248 |  | 1 | 14.766 | false | false | false | false | false | false | 2025-03-01 | 2025-03-01 | 1 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.8 (Merge) |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.9 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 91e8a2f67ea1e5b6f4719a451f7c3556340c6a8c | apache-2.0 | 0 | 14.766 | true | false | false | true | true | false | 2025-03-06 | 2025-03-09 | 0 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.9 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 8f12f3e6fcecc948f848c6f7df9361933f39996a |  | 0 | 14.766 | false | false | false | false | false | false | 2025-03-07 |  | 0 | Removed |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9-stock | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 5d8d588acc956b3bc575ae9d2b2b881ff60c13f7 |  | 0 | 14.766 | false | false | false | false | false | false | 2025-03-07 |  | 0 | Removed |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.1 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 185ba0627a43d0ed1d0838ca65e09c4a9da061e3 |  | 1 | 14.766 | false | false | false | true | false | false | 2025-03-10 | 2025-03-10 | 1 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.1 (Merge) |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.2 | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 90d3b5a25b5f7aec6c2bce2cf6150d2565324d4a | apache-2.0 | 1 | 14.766 | true | false | false | true | true | false | 2025-03-12 | 2025-03-13 | 1 | Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.2 (Merge) |
| Lunzima/NQLSG-Qwen2.5-14B-OriginalFusion | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | d07dd0ea8d735f36c9c28625682da4d71c7ef871 |  | 2 | 14.766 | false | false | false | false | false | false | 2025-03-01 | 2025-03-01 | 1 | Lunzima/NQLSG-Qwen2.5-14B-OriginalFusion (Merge) |
| Lyte/Llama-3.1-8B-Instruct-Reasoner-1o1_v0.3 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | 35ab483f04afa763f36f978408f4f82e0379ee25 |  | 0 | 8 | false | false | false | true | false | false | 2024-09-17 |  | 0 | Removed |
| Lyte/Llama-3.2-1B-Instruct-COT-RL-Expriement1-EP04 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | 59d93307c6f2cb7a29c593cbc7393122d502d1b1 |  | 0 | 1.236 | false | false | false | true | false | false | 2024-09-26 |  | 0 | Removed |
| Lyte/Llama-3.2-3B-Overthinker | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | 0e7af37fb3381365905fc2df24811c0e6d2ba5b2 | apache-2.0 | 20 | 3.213 | true | false | false | true | false | false | 2024-10-17 | 2024-10-18 | 2 | meta-llama/Llama-3.2-3B-Instruct |
| M4-ai/TinyMistral-248M-v3 | bfloat16 | 🟢 pretrained | Original | MistralForCausalLM | fa23fe617768c671f0bbbff1edf4556cfe844167 | apache-2.0 | 8 | 0.248 | true | false | false | false | false | false | 2024-02-05 | 2024-10-18 | 0 | M4-ai/TinyMistral-248M-v3 |
| MEscriva/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis | float16 | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | 7a9d848188a674302d64a865786d4508be19571a |  | 0 | 0.63 | false | false | false | false | false | false | 2024-11-19 |  | 0 | Removed |
| MLP-KTLim/llama-3-Korean-Bllossom-8B | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | 8a738f9f622ffc2b0a4a6b81dabbca80406248bf | llama3 | 319 | 8.03 | true | false | false | true | false | false | 2024-04-25 | 2024-07-09 | 1 | MLP-KTLim/llama-3-Korean-Bllossom-8B (Merge) |
| MTSAIR/Cotype-Nano | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | Qwen2ForCausalLM | 91817ff717dd16d216304fa9d749e08fce2aa38d | other | 50 | 1.544 | true | false | false | true | false | false | 2024-11-22 | 2024-12-01 | 0 | MTSAIR/Cotype-Nano |
| MTSAIR/MultiVerse_70B | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | 063430cdc4d972a0884e3e3e3d45ea4afbdf71a2 | other | 39 | 72.289 | true | false | false | false | false | false | 2024-03-25 | 2024-06-29 | 0 | MTSAIR/MultiVerse_70B |
| Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.1 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | 1ed587f54f70334f495efb9c027acb03e96fe24f | llama3 | 4 | 8.03 | true | false | false | true | false | false | 2024-06-06 | 2024-09-17 | 1 | meta-llama/Meta-Llama-3-8B |
| Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.3 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | d2578eb754d1c20efe604749296580f680950917 | llama3 | 6 | 8.03 | true | false | false | true | false | false | 2024-07-13 | 2024-08-06 | 1 | meta-llama/Meta-Llama-3-8B |
| Magpie-Align/Llama-3-8B-Magpie-Align-v0.1 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | a83ddac146fb2da1dd1bfa4069e336074d1439a8 | llama3 | 10 | 8.03 | true | false | false | true | false | false | 2024-06-29 | 2024-07-03 | 2 | meta-llama/Meta-Llama-3-8B |
| Magpie-Align/Llama-3-8B-Magpie-Align-v0.1 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | a83ddac146fb2da1dd1bfa4069e336074d1439a8 | llama3 | 10 | 8.03 | true | false | false | true | false | false | 2024-06-29 | 2024-07-03 | 2 | meta-llama/Meta-Llama-3-8B |
| Magpie-Align/Llama-3-8B-Magpie-Align-v0.3 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | 7e420ddd6ff48bf213dcab2a9ddb7845b80dd1aa | llama3 | 3 | 8.03 | true | false | false | true | false | false | 2024-07-15 | 2024-08-06 | 2 | meta-llama/Meta-Llama-3-8B |
| Magpie-Align/Llama-3.1-8B-Magpie-Align-SFT-v0.1 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | b191916912f0e76b2bdc93c46c0af590cc87e7ae | llama3.1 | 2 | 8.03 | true | false | false | true | false | false | 2024-07-23 | 2024-09-17 | 1 | meta-llama/Meta-Llama-3.1-8B |
| Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | dd34258a5f2bf7630b5a8e5662b050c60a088927 | llama3.1 | 3 | 8.03 | true | false | false | true | false | false | 2024-07-24 | 2024-09-17 | 2 | meta-llama/Meta-Llama-3.1-8B |
| Magpie-Align/MagpieLM-8B-Chat-v0.1 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | Original | LlamaForCausalLM | 0b30eabc82a01fb42f44ba62c2dc81e1bd09cc04 | llama3.1 | 22 | 8.03 | true | false | false | true | false | false | 2024-09-15 | 2024-09-19 | 2 | meta-llama/Meta-Llama-3.1-8B |
| Magpie-Align/MagpieLM-8B-SFT-v0.1 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | b91f605a511707cb3b7f0893a8ed80c77b32d5a8 | llama3.1 | 3 | 8.03 | true | false | false | true | false | false | 2024-09-15 | 2024-09-19 | 1 | meta-llama/Meta-Llama-3.1-8B |
| MagusCorp/grpo_lora_enem_llama3_7b | float16 | 🤝 base merges and moerges | Original | LlamaForCausalLM | a7782b3ab1954d78353985e1ae4a7cf24a651209 | apache-2.0 | 0 | 8.03 | true | false | false | false | false | false | 2025-02-11 | 2025-02-11 | 3 | meta-llama/Meta-Llama-3.1-8B |
| ManoloPueblo/ContentCuisine_1-7B-slerp | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | e811e880075a2945623040ee43e9a6972675ff2e |  | 1 | 7.242 | false | false | false | false | false | false | 2024-11-12 | 2024-11-12 | 1 | ManoloPueblo/ContentCuisine_1-7B-slerp (Merge) |
| ManoloPueblo/LLM_MERGE_CC2 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | a39dcd4e8175c0e2ab9bda2c7a4f377b97549644 | apache-2.0 | 1 | 7.242 | true | false | false | false | true | false | 2024-11-02 | 2024-11-12 | 0 | ManoloPueblo/LLM_MERGE_CC2 |
| ManoloPueblo/LLM_MERGE_CC3 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | 79d2bd3866e363b9e700f59cfc573b2bc9de2442 | apache-2.0 | 1 | 7.242 | true | false | false | false | true | false | 2024-11-10 | 2024-11-12 | 0 | ManoloPueblo/LLM_MERGE_CC3 |
| MarinaraSpaghetti/NemoReRemix-12B | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | 9ebc7c2d4577b663fb050d86ed91fb676eb2e1f2 |  | 27 | 12.248 | false | false | false | false | false | false | 2024-08-14 | 2024-09-17 | 1 | MarinaraSpaghetti/NemoReRemix-12B (Merge) |
| MarinaraSpaghetti/Nemomix-v4.0-12B | bfloat16 | 🤝 base merges and moerges | Original | MistralForCausalLM | 69fbd8449ce3e916fc257e982a78189308123074 |  | 26 | 12.248 | false | false | false | true | false | false | 2024-07-30 | 2024-08-02 | 1 | MarinaraSpaghetti/Nemomix-v4.0-12B (Merge) |
| Marsouuu/MiniMathExpert-2_61B-ECE-PRYMMAL-Martial | bfloat16 | 🤝 base merges and moerges | Original | Gemma2ForCausalLM | df21939a22e7233ebb7d62dfaf1c854facc5c772 | apache-2.0 | 1 | 2.614 | true | false | false | false | true | false | 2024-10-06 | 2024-10-06 | 1 | Marsouuu/MiniMathExpert-2_61B-ECE-PRYMMAL-Martial (Merge) |
| Marsouuu/MiniQwenMathExpert-ECE-PRYMMAL-Martial | bfloat16 | 🤝 base merges and moerges | Original | Qwen2ForCausalLM | 0787682e65f7763ef978c4cf2e32803be8b49298 |  | 0 | 1.777 | false | false | false | false | false | false | 2024-10-07 | 2024-10-07 | 1 | Marsouuu/MiniQwenMathExpert-ECE-PRYMMAL-Martial (Merge) |
| Marsouuu/MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial | bfloat16 | 🤝 base merges and moerges | Original | MixtralForCausalLM | 9cb9e74d2a65abd6458dffac103ad99c3b8f5154 | apache-2.0 | 1 | 24.16 | true | true | false | false | true | false | 2024-10-03 | 2024-10-03 | 1 | Marsouuu/MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial (Merge) |
| Marsouuu/general3B-ECE-PRYMMAL-Martial | bfloat16 | 🤝 base merges and moerges | Original | Phi3ForCausalLM |  |  |  |  |  |  |  |  |  |  |  |  |  |  |

Each record's eval_name is the fullname with "/" replaced by "_", followed by "_" and the Precision (e.g. Lunzima_NQLSG-Qwen2.5-14B-MegaFusion-v8_bfloat16), and the T field repeats the Type emoji. Each record's Model field links to the model's Hub page (https://huggingface.co/<fullname>) and to its evaluation-details dataset (https://huggingface.co/datasets/open-llm-leaderboard/<org>__<model>-details).

Benchmark scores (raw / normalized):

| fullname | Average ⬆️ | CO₂ cost (kg) | IFEval | BBH | MATH Lvl 5 | GPQA | MUSR | MMLU-PRO |
|---|---|---|---|---|---|---|---|---|
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8 | 42.783691 | 5.303605 | 0.787476 / 78.747612 | 0.641947 / 49.034138 | 0.555891 / 55.589124 | 0.33557 / 11.409396 | 0.439365 / 15.18724 | 0.520612 / 46.734634 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.5 | 38.229989 | 1.99894 | 0.592862 / 59.286249 | 0.645131 / 49.078225 | 0.365559 / 36.555891 | 0.380034 / 17.337808 | 0.476969 / 19.454427 | 0.529006 / 47.667332 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.6 | 39.625747 | 1.991234 | 0.591938 / 59.193828 | 0.645717 / 48.985259 | 0.4071 / 40.70997 | 0.384228 / 17.897092 | 0.495323 / 22.082031 | 0.539977 / 48.886303 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.7 | 43.093411 | 3.563218 | 0.787476 / 78.747612 | 0.648276 / 49.910093 | 0.540785 / 54.07855 | 0.35151 / 13.534676 | 0.438063 / 15.157812 | 0.524186 / 47.131723 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.8 | 41.546551 | 1.759708 | 0.702796 / 70.279636 | 0.656563 / 50.746425 | 0.423716 / 42.371601 | 0.375839 / 16.778523 | 0.491198 / 21.066406 | 0.53233 / 48.036717 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v8.9 | 42.523099 | 3.461802 | 0.799341 / 79.93413 | 0.64831 / 49.94687 | 0.537009 / 53.700906 | 0.329698 / 10.626398 | 0.432823 / 14.269531 | 0.519947 / 46.660757 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9 | 38.8709 | 1.984922 | 0.52352 / 52.351982 | 0.654559 / 50.255502 | 0.436556 / 43.655589 | 0.388423 / 18.456376 | 0.480563 / 19.370313 | 0.542221 / 49.135638 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9-stock | 40.705448 | 2.004717 | 0.651364 / 65.136394 | 0.657067 / 50.620019 | 0.418429 / 41.8429 | 0.384228 / 17.897092 | 0.481958 / 19.711458 | 0.541223 / 49.024823 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.1 | 43.309072 | 1.659595 | 0.800266 / 80.026552 | 0.655475 / 50.737878 | 0.546828 / 54.682779 | 0.343121 / 12.416107 | 0.435396 / 14.757812 | 0.5251 / 47.233304 |
| Lunzima/NQLSG-Qwen2.5-14B-MegaFusion-v9.2 | 43.225049 | 1.680557 | 0.786227 / 78.622721 | 0.653769 / 50.455175 | 0.533233 / 53.323263 | 0.355705 / 14.09396 | 0.438094 / 15.261719 | 0.528341 / 47.593454 |
| Lunzima/NQLSG-Qwen2.5-14B-OriginalFusion | 40.609631 | 1.703825 | 0.614195 / 61.419478 | 0.659217 / 50.991749 | 0.427492 / 42.749245 | 0.380872 / 17.449664 | 0.512156 / 23.952865 | 0.523853 / 47.094784 |
| Lyte/Llama-3.1-8B-Instruct-Reasoner-1o1_v0.3 | 25.753461 | 1.831475 | 0.709816 / 70.981551 | 0.494952 / 27.835212 | 0.190332 / 19.033233 | 0.270134 / 2.684564 | 0.346125 / 4.898958 | 0.361785 / 29.087249 |
| Lyte/Llama-3.2-1B-Instruct-COT-RL-Expriement1-EP04 | 14.647308 | 0.900267 | 0.57735 / 57.735032 | 0.351504 / 8.894409 | 0.08006 / 8.006042 | 0.260067 / 1.342282 | 0.323552 / 2.54401 | 0.184259 / 9.362072 |
| Lyte/Llama-3.2-3B-Overthinker | 21.167474 | 1.467279 | 0.640798 / 64.079753 | 0.432009 / 20.095582 | 0.156344 / 15.634441 | 0.259228 / 1.230425 | 0.341906 / 3.904948 | 0.298537 / 22.059693 |
| M4-ai/TinyMistral-248M-v3 | 4.205636 | 0.468367 | 0.163866 / 16.386632 | 0.288455 / 1.777554 | 0.004532 / 0.453172 | 0.240772 / 0 | 0.379333 / 5.15 | 0.113198 / 1.46646 |
| MEscriva/ECE-PRYMMAL-0.5B-FT-V5-MUSR-Mathis | 3.918739 | 2.103112 | 0.086629 / 8.662903 | 0.305729 / 3.237774 | 0.010574 / 1.057402 | 0.251678 / 0.223714 | 0.401719 / 8.614844 | 0.115442 / 1.715795 |
| MLP-KTLim/llama-3-Korean-Bllossom-8B | 20.396916 | 1.549441 | 0.51128 / 51.128007 | 0.490046 / 26.927528 | 0.101964 / 10.196375 | 0.262584 / 1.677852 | 0.367458 / 3.632292 | 0.359375 / 28.819444 |
| MTSAIR/Cotype-Nano | 13.812756 | 0.986576 | 0.374792 / 37.479222 | 0.386494 / 14.44687 | 0.097432 / 9.743202 | 0.270134 / 2.684564 | 0.328917 / 2.114583 | 0.247673 / 16.408097 |
| MTSAIR/MultiVerse_70B | 32.244365 | 27.203635 | 0.524918 / 52.491833 | 0.618313 / 46.135899 | 0.192598 / 19.259819 | 0.354027 / 13.870246 | 0.47399 / 18.815365 | 0.486037 / 42.893026 |
| Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.1 | 15.954088 | 1.667138 | 0.436142 / 43.614166 | 0.46151 / 23.990124 | 0.057402 / 5.740181 | 0.262584 / 1.677852 | 0.32774 / 0 | 0.28632 / 20.702202 |
| Magpie-Align/Llama-3-8B-Magpie-Align-SFT-v0.3 | 17.553225 | 1.79084 | 0.506359 / 50.635868 | 0.457158 / 23.698816 | 0.073263 / 7.326284 | 0.26594 / 2.12528 | 0.342375 / 0.396875 | 0.290226 / 21.136229 |
| Magpie-Align/Llama-3-8B-Magpie-Align-v0.1 (bfloat16) | 16.473094 | 0.906849 | 0.411812 / 41.181177 | 0.481144 / 26.691761 | 0.033988 / 3.398792 | 0.275168 / 3.355705 | 0.304698 / 1.920573 | 0.300615 / 22.290559 |
| Magpie-Align/Llama-3-8B-Magpie-Align-v0.1 (float16) | 16.484005 | 2.774581 | 0.402719 / 40.271923 | 0.478941 / 26.289712 | 0.046073 / 4.607251 | 0.276846 / 3.579418 | 0.308698 / 1.920573 | 0.300116 / 22.235151 |
| Magpie-Align/Llama-3-8B-Magpie-Align-v0.3 | 17.402495 | 1.484579 | 0.449706 / 44.970567 | 0.456961 / 24.311447 | 0.056647 / 5.664653 | 0.265101 / 2.013423 | 0.340604 / 3.742188 | 0.313414 / 23.712692 |
| Magpie-Align/Llama-3.1-8B-Magpie-Align-SFT-v0.1 | 17.975799 | 2.749024 | 0.478207 / 47.820671 | 0.476416 / 26.136677 | 0.089879 / 8.987915 | 0.260906 / 1.454139 | 0.33974 / 1.866667 | 0.294299 / 21.588726 |
| Magpie-Align/Llama-3.1-8B-Magpie-Align-v0.1 | 17.546855 | 1.416082 | 0.445784 / 44.578385 | 0.46224 / 24.040537 | 0.066465 / 6.646526 | 0.263423 / 1.789709 | 0.314062 / 3.091146 | 0.326213 / 25.134826 |
| Magpie-Align/MagpieLM-8B-Chat-v0.1 | 15.026344 | 1.473753 | 0.370071 / 37.007141 | 0.417234 / 18.255805 | 0.061178 / 6.117825 | 0.261745 / 1.565996 | 0.350063 / 2.824479 | 0.319481 / 24.38682 |
| Magpie-Align/MagpieLM-8B-SFT-v0.1 | 17.78393 | 1.600842 | 0.472062 / 47.206191 | 0.455285 / 23.612313 | 0.075529 / 7.55287 | 0.267617 / 2.348993 | 0.364885 / 3.877344 | 0.298953 / 22.105866 |
| MagusCorp/grpo_lora_enem_llama3_7b | 21.634388 | 0.916609 | 0.472362 / 47.236222 | 0.480146 / 25.896379 | 0.121601 / 12.160121 | 0.309564 / 7.941834 | 0.397125 / 7.973958 | 0.35738 / 28.597813 |
| ManoloPueblo/ContentCuisine_1-7B-slerp | 21.052798 | 0.999012 | 0.390704 / 39.070444 | 0.518844 / 32.789744 | 0.073263 / 7.326284 | 0.302852 / 7.04698 | 0.467198 / 17.266406 | 0.305352 / 22.816933 |
| ManoloPueblo/LLM_MERGE_CC2 | 20.747368 | 1.147087 | 0.385309 / 38.530876 | 0.520937 / 33.241074 | 0.064199 / 6.41994 | 0.30453 / 7.270694 | 0.459292 / 16.444792 | 0.303191 / 22.576832 |
| ManoloPueblo/LLM_MERGE_CC3 | 21.67864 | 1.073861 | 0.395875 / 39.587517 | 0.524629 / 33.230018 | 0.079305 / 7.930514 | 0.309564 / 7.941834 | 0.467167 / 17.429167 | 0.315575 / 23.952793 |
| MarinaraSpaghetti/NemoReRemix-12B | 22.034581 | 3.154015 | 0.334251 / 33.42509 | 0.553651 / 36.124702 | 0.090634 / 9.063444 | 0.317953 / 9.060403 | 0.450146 / 15.668229 | 0.359791 / 28.865618 |
| MarinaraSpaghetti/Nemomix-v4.0-12B | 24.467977 | 2.709097 | 0.557466 / 55.746641 | 0.527499 / 32.879943 | 0.108006 / 10.800604 | 0.291946 / 5.592841 | 0.424448 / 12.75599 | 0.361287 / 29.031841 |
| Marsouuu/MiniMathExpert-2_61B-ECE-PRYMMAL-Martial | 12.49462 | 2.90558 | 0.254842 / 25.48416 | 0.395273 / 15.297499 | 0.074018 / 7.401813 | 0.275168 / 3.355705 | 0.408323 / 9.273698 | 0.227394 / 14.154846 |
| Marsouuu/MiniQwenMathExpert-ECE-PRYMMAL-Martial | 15.081989 | 1.378316 | 0.279496 / 27.949618 | 0.423013 / 19.019949 | 0.114048 / 11.404834 | 0.281879 / 4.250559 | 0.38674 / 6.509115 | 0.292221 / 21.357861 |
| Marsouuu/MistralBase-4x7B-MoE-ECE-PRYMMAL-Martial | 6.76176 | 3.813018 | 0.169736 / 16.97363 | 0.346437 / 8.870227 | 0.01435 / 1.435045 | 0.259228 / 1.230425 | 0.399083 / 7.852083 | 0.137882 / 4.209146 |
Marsouuu/general3B-ECE-PRYMMAL-Martial
42992194a835a6fcad1edf1f94527ac08a7a60fb
22.978905
apache-2.0
0
3.821
true
false
false
false
1.448885
0.272227
27.222658
0.539435
35.700873
0.154834
15.483384
0.319631
9.284116
0.470052
18.223177
0.387633
31.95922
true
false
2024-10-23
2024-10-23
1
Marsouuu/general3B-ECE-PRYMMAL-Martial (Merge)
Marsouuu_general3Bv2-ECE-PRYMMAL-Martial_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/general3Bv2-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/general3Bv2-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__general3Bv2-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/general3Bv2-ECE-PRYMMAL-Martial
c6c5b3b0ecf9d04fc3a35bc4135df7cc08be3eb9
31.917859
apache-2.0
1
7.616
true
false
false
false
1.434263
0.569282
56.928173
0.563657
37.667763
0.367069
36.706949
0.310403
8.053691
0.439604
13.283854
0.449801
38.866726
true
false
2024-11-06
2024-11-06
1
Marsouuu/general3Bv2-ECE-PRYMMAL-Martial (Merge)
Marsouuu_lareneg1_78B-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__lareneg1_78B-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial
907a62bb805596e2105c9dca28c0e9ed1e9fd402
15.081989
apache-2.0
0
1.777
true
false
false
false
1.265784
0.279496
27.949618
0.423013
19.019949
0.114048
11.404834
0.281879
4.250559
0.38674
6.509115
0.292221
21.357861
true
false
2024-10-23
2024-10-23
1
Marsouuu/lareneg1_78B-ECE-PRYMMAL-Martial (Merge)
Marsouuu_lareneg3B-ECE-PRYMMAL-Martial_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/lareneg3B-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/lareneg3B-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__lareneg3B-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/lareneg3B-ECE-PRYMMAL-Martial
2c8be0ac28ae27dd441298e83f19e17409d89f4e
23.942055
apache-2.0
0
3.821
true
false
false
false
0.981989
0.330329
33.032908
0.545333
36.350722
0.151813
15.181269
0.324664
9.955257
0.472469
18.391927
0.376662
30.740248
true
false
2024-11-06
2024-11-06
1
Marsouuu/lareneg3B-ECE-PRYMMAL-Martial (Merge)
Marsouuu_lareneg3Bv2-ECE-PRYMMAL-Martial_float16
float16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Marsouuu__lareneg3Bv2-ECE-PRYMMAL-Martial-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial
ff92a6f314c392085af6c85f60a7da745e064653
32.112666
apache-2.0
2
7.616
true
false
false
false
1.331489
0.575327
57.53268
0.562336
37.47164
0.365559
36.555891
0.319631
9.284116
0.436938
12.817188
0.45113
39.01448
true
false
2024-11-06
2024-11-06
1
Marsouuu/lareneg3Bv2-ECE-PRYMMAL-Martial (Merge)
MaziyarPanahi_Calme-4x7B-MoE-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Calme-4x7B-MoE-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Calme-4x7B-MoE-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Calme-4x7B-MoE-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Calme-4x7B-MoE-v0.1
e2fab90eef37977002947684043f139a1660f519
20.023903
apache-2.0
2
24.154
true
true
false
false
2.721966
0.431521
43.152059
0.510282
31.261878
0.08006
8.006042
0.281879
4.250559
0.419885
10.61901
0.305685
22.853871
false
false
2024-03-17
2024-08-05
0
MaziyarPanahi/Calme-4x7B-MoE-v0.1
MaziyarPanahi_Calme-4x7B-MoE-v0.2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Calme-4x7B-MoE-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Calme-4x7B-MoE-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Calme-4x7B-MoE-v0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Calme-4x7B-MoE-v0.2
ffef41baf94b3f88b30cf0aeb3fd72d9e4187161
20.176361
apache-2.0
2
24.154
true
true
false
false
2.831422
0.429447
42.94472
0.511077
31.39682
0.074018
7.401813
0.279362
3.914989
0.43176
12.536719
0.305768
22.863106
false
false
2024-03-17
2024-08-05
0
MaziyarPanahi/Calme-4x7B-MoE-v0.2
MaziyarPanahi_Llama-3-70B-Instruct-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-70B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-70B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-70B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-70B-Instruct-v0.1
6db1cb4256525fc5429734ddc0eb941d08d0be30
26.333913
llama3
1
70.554
true
false
false
true
22.527972
0.471438
47.143801
0.536626
32.712917
0.180514
18.05136
0.284396
4.58613
0.443302
15.31276
0.461769
40.196513
false
false
2024-05-14
2024-06-26
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_Llama-3-8B-Instruct-v0.10_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.10" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.10</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.10-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-8B-Instruct-v0.10
4411eb9f6f5e4c462a6bdbc64c26dcc123100b66
26.797403
other
6
8.03
true
false
false
true
2.284261
0.766743
76.674335
0.492431
27.924674
0.057402
5.740181
0.308725
7.829978
0.421437
10.813021
0.38622
31.802231
false
false
2024-06-04
2024-06-26
4
meta-llama/Meta-Llama-3-8B-Instruct
MaziyarPanahi_Llama-3-8B-Instruct-v0.8_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-8B-Instruct-v0.8
94d222b8447b600b9836da4036df9490b59fe966
26.888884
other
8
8.03
true
false
false
true
3.624847
0.752755
75.275491
0.496278
28.270419
0.077795
7.779456
0.305369
7.38255
0.420198
10.92474
0.385306
31.70065
false
false
2024-05-01
2024-07-11
2
meta-llama/Meta-Llama-3-8B-Instruct
MaziyarPanahi_Llama-3-8B-Instruct-v0.9_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Llama-3-8B-Instruct-v0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Llama-3-8B-Instruct-v0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Llama-3-8B-Instruct-v0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Llama-3-8B-Instruct-v0.9
ddf91fdc0a3ab5e5d76864f1c4cf44e5adacd565
26.786644
other
6
8.03
true
false
false
true
1.532714
0.763046
76.304649
0.493613
27.903013
0.073263
7.326284
0.307886
7.718121
0.414802
9.85026
0.384558
31.617538
false
false
2024-05-30
2024-08-06
3
meta-llama/Meta-Llama-3-8B-Instruct
MaziyarPanahi_Qwen1.5-MoE-A2.7B-Wikihow_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2MoeForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Qwen1.5-MoE-A2.7B-Wikihow-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Qwen1.5-MoE-A2.7B-Wikihow
191cf0630b7b50fe1fc9be198e1f203935df1428
12.325435
apache-2.0
3
14.316
true
true
false
true
16.612166
0.295433
29.543279
0.392007
15.473439
0.082326
8.232628
0.275168
3.355705
0.350219
2.010677
0.238032
15.336879
false
false
2024-03-30
2024-09-12
1
Qwen/Qwen1.5-MoE-A2.7B
MaziyarPanahi_Qwen2-7B-Instruct-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Qwen2-7B-Instruct-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Qwen2-7B-Instruct-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Qwen2-7B-Instruct-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Qwen2-7B-Instruct-v0.1
5123ecd76cefd4ef3b6009542b13e060d03e5232
22.981509
apache-2.0
1
7.616
true
false
false
false
2.907199
0.335225
33.522498
0.512306
31.923607
0.221299
22.129909
0.285235
4.697987
0.443479
13.868229
0.385721
31.746823
false
false
2024-06-27
2024-07-07
1
Qwen/Qwen2-7B
MaziyarPanahi_Qwen2-7B-Instruct-v0.8_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/Qwen2-7B-Instruct-v0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/Qwen2-7B-Instruct-v0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__Qwen2-7B-Instruct-v0.8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/Qwen2-7B-Instruct-v0.8
a6f9d0e11efcba18c905554ab43b877ead187a77
19.558138
apache-2.0
6
7.616
true
false
false
false
2.668213
0.277473
27.747266
0.463711
25.532525
0.176737
17.673716
0.293624
5.816555
0.429313
12.064063
0.356632
28.514702
false
false
2024-06-27
2024-07-07
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.1-llama3.1-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-llama3.1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-llama3.1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-llama3.1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-llama3.1-70b
f39ad1c90b0f30379e80756d29c6533cf84c362a
40.936033
4
70.554
false
false
false
true
30.909679
0.84343
84.342988
0.644755
48.553646
0.410121
41.012085
0.32802
10.402685
0.438031
13.720573
0.528258
47.58422
false
false
2024-07-23
2024-07-24
2
meta-llama/Meta-Llama-3.1-70B
MaziyarPanahi_calme-2.1-phi3-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-phi3-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-phi3-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-phi3-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-phi3-4b
6764c79badacba5fa3584d2d2593d762caa1d17d
25.985365
mit
1
3.821
true
false
false
true
1.504937
0.552521
55.252065
0.559532
38.12428
0.13142
13.141994
0.329698
10.626398
0.401531
8.258073
0.374584
30.509382
false
false
2024-05-09
2024-06-26
1
microsoft/Phi-3-mini-4k-instruct
MaziyarPanahi_calme-2.1-phi3.5-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-phi3.5-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-phi3.5-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-phi3.5-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-phi3.5-4b
583b7f382a8ed35f6f7c09f2950f0f2346945a83
28.000378
mit
4
3.821
true
false
false
true
2.009103
0.56591
56.590956
0.54837
36.110097
0.203927
20.392749
0.34396
12.527964
0.399458
9.765625
0.393534
32.614879
false
false
2024-08-23
2024-08-23
1
microsoft/Phi-3.5-mini-instruct
MaziyarPanahi_calme-2.1-qwen2-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-qwen2-72b
0369c39770f45f2464587918f2dbdb8449ea3a0d
44.398944
other
28
72.699
true
false
false
true
26.269742
0.816277
81.627748
0.696556
57.325882
0.407855
40.785498
0.380872
17.449664
0.473219
20.152344
0.541473
49.052527
false
false
2024-06-08
2024-06-26
2
Qwen/Qwen2-72B
MaziyarPanahi_calme-2.1-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-qwen2-7b
5aac57e2290f7c49af88a9cb9883ce25b58882a1
23.54188
apache-2.0
1
7.616
true
false
false
true
2.86851
0.381612
38.16119
0.504593
31.007097
0.231118
23.111782
0.28943
5.257271
0.443698
13.795573
0.369265
29.918366
false
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.1-qwen2.5-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-qwen2.5-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-qwen2.5-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-qwen2.5-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-qwen2.5-72b
eb6c92dec932070ea872f39469ca5b9daf2d34e6
47.856722
other
1
72.7
true
false
false
true
29.497787
0.866236
86.623603
0.726162
61.655703
0.59139
59.138973
0.363255
15.100671
0.429844
13.297135
0.561918
51.324246
false
false
2024-09-19
2024-09-26
1
Qwen/Qwen2.5-72B
MaziyarPanahi_calme-2.1-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.1-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.1-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.1-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.1-rys-78b
e746f5ddc0c9b31a2382d985a4ec87fa910847c7
44.643999
mit
3
77.965
true
false
false
true
28.664577
0.813555
81.35547
0.709786
59.470031
0.39426
39.425982
0.394295
19.239374
0.469313
18.997396
0.544382
49.375739
false
false
2024-08-06
2024-08-08
1
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.2-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-llama3-70b
95366b974baedee4d95c1e841bc3d15e94753804
38.140064
llama3
17
70.554
true
false
false
true
21.256547
0.820849
82.084868
0.643543
48.571706
0.239426
23.942598
0.341443
12.192394
0.444573
15.304948
0.520695
46.743868
false
false
2024-04-27
2024-06-26
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_calme-2.2-llama3.1-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-llama3.1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-llama3.1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-llama3.1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-llama3.1-70b
c81ac05ed2c2344e9fd366cfff197da406ef5234
43.311007
2
70.554
false
false
false
true
31.683648
0.859267
85.926675
0.679292
54.206462
0.436556
43.655589
0.324664
9.955257
0.454156
17.069531
0.541473
49.052527
false
false
2024-09-09
2024-09-09
2
meta-llama/Meta-Llama-3.1-70B
MaziyarPanahi_calme-2.2-phi3-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-phi3-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-phi3-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-phi3-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-phi3-4b
c0a366a4c01d7e724ceba7e2f2c19251983423fe
25.232641
mit
2
3.821
true
false
false
true
1.593583
0.506908
50.690834
0.55296
37.733734
0.145015
14.501511
0.321309
9.50783
0.397563
7.695313
0.3814
31.266622
false
false
2024-05-10
2024-06-26
1
microsoft/Phi-3-mini-4k-instruct
MaziyarPanahi_calme-2.2-qwen2-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-qwen2-72b
529e9bd80a76d943409bc92bb246aa7ca63dd9e6
44.090096
other
5
72.706
true
false
false
true
27.034829
0.800815
80.081517
0.69396
56.795942
0.453172
45.317221
0.374161
16.55481
0.450802
16.516927
0.543467
49.274158
false
false
2024-07-09
2024-08-06
1
Qwen/Qwen2-72B
MaziyarPanahi_calme-2.2-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-qwen2-7b
bbb1d119f75c5b2eaa8978286808bd59cae04997
23.583319
apache-2.0
1
7.616
true
false
false
true
3.097499
0.35973
35.972996
0.521491
33.109366
0.214502
21.450151
0.291107
5.480984
0.435823
13.277865
0.389877
32.208555
false
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.2-qwen2.5-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-qwen2.5-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-qwen2.5-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-qwen2.5-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-qwen2.5-72b
c6c7fdf70d8bf81364108975eb8ba78eecac83d4
47.224577
other
6
72.7
true
false
false
true
28.516128
0.847676
84.767639
0.72764
61.803604
0.589124
58.912387
0.35906
14.541387
0.420667
12.016667
0.561752
51.305777
false
false
2024-09-19
2024-09-26
1
Qwen/Qwen2.5-72B
MaziyarPanahi_calme-2.2-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.2-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.2-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.2-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.2-rys-78b
8d0dde25c9042705f65559446944a19259c3fc8e
44.386334
mit
3
77.965
true
false
false
true
27.046713
0.798642
79.864205
0.708101
59.268646
0.4071
40.70997
0.406879
20.917226
0.453563
16.828646
0.538564
48.729314
false
false
2024-08-06
2024-08-08
1
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.3-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-llama3-70b
bd17453eaae0e36d1e1e17da13fdd155fce91a29
37.067032
llama3
4
70.554
true
false
false
true
19.273618
0.80104
80.104013
0.639917
48.008585
0.232628
23.26284
0.338087
11.744966
0.426125
12.565625
0.520445
46.716164
false
false
2024-04-27
2024-08-30
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_calme-2.3-llama3.1-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-llama3.1-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-llama3.1-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-llama3.1-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-llama3.1-70b
a39c79250721b75beefa1b1763895eafd010f6f6
43.27519
3
70.554
false
false
false
true
28.121116
0.860466
86.046579
0.687165
55.585495
0.392749
39.274924
0.34396
12.527964
0.456823
17.736198
0.53632
48.479979
false
false
2024-09-10
2024-09-18
2
meta-llama/Meta-Llama-3.1-70B
MaziyarPanahi_calme-2.3-phi3-4b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-phi3-4b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-phi3-4b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-phi3-4b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-phi3-4b
e1f70c3724c728aadd1c7c1bb279487494f7059e
24.981613
mit
9
3.821
true
false
false
true
1.675862
0.492645
49.264507
0.553787
37.658892
0.147281
14.728097
0.317953
9.060403
0.398833
7.754167
0.382813
31.423611
false
false
2024-05-10
2024-06-26
1
microsoft/Phi-3-mini-4k-instruct
MaziyarPanahi_calme-2.3-qwen2-72b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-qwen2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-qwen2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-qwen2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-qwen2-72b
12ff2e800f968e867a580c072905cf4671da066f
33.000831
other
2
72.706
true
false
false
true
38.89697
0.384984
38.498406
0.657631
51.228304
0.317221
31.722054
0.371644
16.219239
0.41124
11.238281
0.541888
49.0987
false
false
2024-08-06
2024-09-15
1
Qwen/Qwen2-72B
MaziyarPanahi_calme-2.3-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-qwen2-7b
ca39e60052a600a709e03fefceabd9620e0b66d7
23.081701
apache-2.0
2
7.616
true
false
false
true
3.6604
0.382486
38.248625
0.506405
30.956082
0.206949
20.694864
0.29698
6.263982
0.44224
13.313281
0.36112
29.013372
false
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.3-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.3-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.3-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.3-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.3-rys-78b
a8a4e55c2f7054d25c2f0ab3a3b3d806eb915180
44.557374
mit
4
77.965
true
false
false
true
26.597219
0.806585
80.658542
0.710776
59.574547
0.398036
39.803625
0.404362
20.581655
0.454927
16.999219
0.54754
49.726655
false
false
2024-08-06
2024-09-03
1
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.4-llama3-70b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.4-llama3-70b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.4-llama3-70b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.4-llama3-70b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.4-llama3-70b
cb03e4d810b82d86e7cb01ab146bade09a5d06d1
32.486225
llama3
14
70.554
true
false
false
true
35.487517
0.502737
50.273718
0.641819
48.397766
0.244713
24.471299
0.339765
11.96868
0.428792
13.098958
0.520362
46.70693
false
false
2024-04-28
2024-06-26
2
meta-llama/Meta-Llama-3-70B
MaziyarPanahi_calme-2.4-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.4-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.4-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.4-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.4-qwen2-7b
d683c3ef1feb13e92227f5fd92fe5bc4b55ea4a2
22.851441
apache-2.0
1
7.616
true
false
false
true
3.236981
0.329955
32.995452
0.510142
31.818266
0.203172
20.317221
0.283557
4.474273
0.445281
14.426823
0.397689
33.076611
false
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.4-rys-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.4-rys-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.4-rys-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.4-rys-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.4-rys-78b
0a35e51ffa9efa644c11816a2d56434804177acb
50.765047
mit
46
77.965
true
false
false
true
25.952656
0.80109
80.109
0.727951
62.156549
0.4071
40.70997
0.402685
20.357942
0.577062
34.566146
0.700216
66.690677
false
false
2024-08-07
2024-09-03
2
dnhkng/RYS-XLarge
MaziyarPanahi_calme-2.5-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.5-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.5-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.5-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.5-qwen2-7b
20fb1afc22c0722cb2c57185fff59befeba0fbec
22.659539
apache-2.0
1
7.616
true
false
false
true
2.798338
0.314492
31.449221
0.488656
28.280995
0.225831
22.583082
0.310403
8.053691
0.456469
15.791927
0.368185
29.798316
false
false
2024-06-27
2024-09-29
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.6-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.6-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.6-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.6-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.6-qwen2-7b
ebfaae016a50f8922098a2a262ec3ca704504cae
21.232331
apache-2.0
2
7.616
true
false
false
true
3.280542
0.344268
34.426765
0.493024
29.308419
0.121601
12.160121
0.284396
4.58613
0.458615
16.560156
0.373172
30.352394
false
false
2024-06-27
2024-09-29
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-2.7-qwen2-7b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-2.7-qwen2-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-2.7-qwen2-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-2.7-qwen2-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-2.7-qwen2-7b
edc11a1baccedc04a5a4576ee4910fd8922ad47f
22.355267
apache-2.0
2
7.616
true
false
false
true
2.728561
0.35923
35.923018
0.488317
28.912245
0.138218
13.821752
0.291107
5.480984
0.482427
19.936719
0.370512
30.056885
false
false
2024-06-27
2024-09-18
1
Qwen/Qwen2-7B
MaziyarPanahi_calme-3.1-baguette-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.1-baguette-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.1-baguette-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.1-baguette-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.1-baguette-3b
4601b18deed3931c33907ae98060898e787c7758
25.581602
other
1
3.085
true
false
false
true
1.463179
0.623437
62.343693
0.468333
25.507681
0.256042
25.60423
0.286074
4.809843
0.400792
8.565625
0.339927
26.65854
false
false
2024-11-07
2024-11-08
1
Qwen/Qwen2.5-3B
MaziyarPanahi_calme-3.1-instruct-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.1-instruct-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.1-instruct-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.1-instruct-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.1-instruct-3b
3bbd7f1f7949dd7c3679a29a781a95bd1085dc19
21.507091
other
3
3.085
true
false
false
true
2.787324
0.433594
43.359398
0.481273
27.309896
0.177492
17.749245
0.286074
4.809843
0.395208
7.401042
0.355718
28.413121
false
false
2024-11-07
2024-11-08
1
Qwen/Qwen2.5-3B
MaziyarPanahi_calme-3.1-instruct-78b_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.1-instruct-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.1-instruct-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.1-instruct-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.1-instruct-78b
7ccd7f1a55ae79af7969f721bb7055511cc6b986
51.28749
other
4
77.965
true
false
false
true
64.437889
0.813555
81.35547
0.730515
62.409683
0.392749
39.274924
0.395973
19.463087
0.589062
36.499479
0.718501
68.722296
false
false
2024-11-19
2024-11-27
1
Removed
MaziyarPanahi_calme-3.1-llamaloi-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.1-llamaloi-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.1-llamaloi-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.1-llamaloi-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.1-llamaloi-3b
62547548c06bb22f0b82c2bda7ac466507314a4b
24.093351
llama3.2
1
3.213
true
false
false
true
1.787398
0.737518
73.751756
0.458734
23.769166
0.172961
17.296073
0.28104
4.138702
0.351521
1.106771
0.320479
24.497636
false
false
2024-11-07
2024-11-08
1
meta-llama/Llama-3.2-3B
MaziyarPanahi_calme-3.2-baguette-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.2-baguette-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.2-baguette-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.2-baguette-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.2-baguette-3b
bba8e602432bd467b64cabf9cb62326893060e60
26.332491
other
1
3.085
true
false
false
true
1.552025
0.633828
63.382824
0.470862
25.865747
0.282477
28.247734
0.294463
5.928412
0.402094
8.595052
0.333777
25.975177
false
false
2024-11-07
2024-11-08
1
Qwen/Qwen2.5-3B
MaziyarPanahi_calme-3.2-instruct-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.2-instruct-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.2-instruct-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.2-instruct-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.2-instruct-3b
12347f5991157e752de6ba9f773a1bbc22445e3a
24.620352
other
3
3.086
true
false
false
true
1.486866
0.55332
55.331964
0.486564
27.976798
0.216767
21.676737
0.283557
4.474273
0.404698
8.78724
0.365276
29.475103
false
false
2024-11-07
2024-11-08
1
Qwen/Qwen2.5-3B
MaziyarPanahi_calme-3.2-instruct-78b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.2-instruct-78b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.2-instruct-78b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.2-instruct-78b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.2-instruct-78b
731f4daf584f822f1393731ccff1d58c7f06b99e
52.081384
other
112
77.965
true
false
false
true
66.011131
0.806261
80.626072
0.731862
62.609443
0.403323
40.332326
0.402685
20.357942
0.602365
38.528906
0.730303
70.033614
false
false
2024-11-19
2024-11-28
1
Removed
MaziyarPanahi_calme-3.3-baguette-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.3-baguette-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.3-baguette-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.3-baguette-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.3-baguette-3b
66f9438922503e5616b6b4488e96fd9342d5efb0
27.407101
other
1
3.086
true
false
false
true
1.493846
0.635951
63.59515
0.467822
25.596594
0.380665
38.066465
0.280201
4.026846
0.392823
7.136198
0.334192
26.02135
false
false
2024-11-07
2024-11-08
1
Qwen/Qwen2.5-3B
MaziyarPanahi_calme-3.3-instruct-3b_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/MaziyarPanahi/calme-3.3-instruct-3b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MaziyarPanahi/calme-3.3-instruct-3b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MaziyarPanahi__calme-3.3-instruct-3b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MaziyarPanahi/calme-3.3-instruct-3b
ea7d7fb442c981ecd44c5a9060ac6b062927f231
27.778911
other
3
3.086
true
false
false
true
1.505097
0.642321
64.232126
0.469334
25.682138
0.373867
37.386707
0.282718
4.362416
0.407427
9.395052
0.330535
25.615027
false
false
2024-11-07
2024-11-08
1
Qwen/Qwen2.5-3B
Minami-su_Amara-o1-7B-Qwen_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Minami-su/Amara-o1-7B-Qwen" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minami-su/Amara-o1-7B-Qwen</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Minami-su__Amara-o1-7B-Qwen-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Minami-su/Amara-o1-7B-Qwen
835395d4e693cd8cfb5143f12fae53673164846f
34.488977
apache-2.0
1
7.616
true
false
false
true
1.275119
0.738991
73.899143
0.519942
32.79683
0.518127
51.812689
0.293624
5.816555
0.400667
8.35
0.408328
34.258644
false
false
2025-01-08
2025-01-08
0
Minami-su/Amara-o1-7B-Qwen
Minami-su_Amara-o2-7B-Qwen_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Minami-su/Amara-o2-7B-Qwen" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minami-su/Amara-o2-7B-Qwen</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Minami-su__Amara-o2-7B-Qwen-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Minami-su/Amara-o2-7B-Qwen
16271c35b4e1b33dbf9da9c567a1cea0f9d5142b
31.034507
apache-2.0
3
7.616
true
false
false
true
0.984808
0.714662
71.466154
0.517343
31.798127
0.40861
40.861027
0.263423
1.789709
0.378094
5.128385
0.416473
35.163638
false
false
2025-01-09
2025-01-09
0
Minami-su/Amara-o2-7B-Qwen
Minami-su_test-7B-00_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Minami-su/test-7B-00" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minami-su/test-7B-00</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Minami-su__test-7B-00-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Minami-su/test-7B-00
c8e5b7745c921b5020192f0b3a553c63725048f9
29.950834
0
7.616
false
false
false
true
1.501208
0.669049
66.904923
0.446612
21.48995
0.451662
45.166163
0.302852
7.04698
0.412604
10.342188
0.358793
28.754802
false
false
2024-12-24
0
Removed
Minami-su_test-7B-01_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Minami-su/test-7B-01" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minami-su/test-7B-01</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Minami-su__test-7B-01-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Minami-su/test-7B-01
9af628070dff4480252a4d4e5f07a9884e3f71d4
30.081509
0
7.616
false
false
false
true
1.529037
0.67362
67.362044
0.442236
20.824494
0.455438
45.543807
0.307047
7.606264
0.415302
10.979427
0.353557
28.17302
false
false
2024-12-24
0
Removed
Minami-su_test-v2-7B-00_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/Minami-su/test-v2-7B-00" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Minami-su/test-v2-7B-00</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Minami-su__test-v2-7B-00-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Minami-su/test-v2-7B-00
fcba9bb21f9cf521dcd5d41749ccce77434fe4dc
29.48435
0
7.616
false
false
false
true
1.53345
0.67472
67.471974
0.441599
21.190755
0.441843
44.18429
0.291946
5.592841
0.415427
10.995052
0.347241
27.471188
false
false
2024-12-25
0
Removed
ModelCloud_Llama-3.2-1B-Instruct-gptqmodel-4bit-vortex-v1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ModelCloud/Llama-3.2-1B-Instruct-gptqmodel-4bit-vortex-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ModelCloud/Llama-3.2-1B-Instruct-gptqmodel-4bit-vortex-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ModelCloud__Llama-3.2-1B-Instruct-gptqmodel-4bit-vortex-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ModelCloud/Llama-3.2-1B-Instruct-gptqmodel-4bit-vortex-v1
c7a837a34207b3b3b949f8d0344a66d3a3ad4255
12.430004
llama3.2
2
5.453
true
false
false
true
1.321828
0.526892
52.689198
0.325273
5.859172
0.060423
6.042296
0.253356
0.447427
0.324917
1.047917
0.176446
8.494016
false
false
2024-10-30
2025-01-02
1
ModelCloud/Llama-3.2-1B-Instruct-gptqmodel-4bit-vortex-v1 (Merge)
ModelSpace_GemmaX2-28-9B-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/ModelSpace/GemmaX2-28-9B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ModelSpace/GemmaX2-28-9B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ModelSpace__GemmaX2-28-9B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ModelSpace/GemmaX2-28-9B-v0.1
cdb5e2e66bd966cd475ef24f7a9eb61c55b25bf7
5.991109
gemma
47
10.159
true
false
false
false
1.066059
0.003922
0.392182
0.368723
11.707677
0.02719
2.719033
0.276846
3.579418
0.353656
3.873698
0.223072
13.674645
false
false
2024-11-28
2025-02-24
1
ModelSpace/GemmaX2-28-9B-v0.1 (Merge)
MoonRide_Llama-3.2-3B-Khelavaster_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MoonRide/Llama-3.2-3B-Khelavaster" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MoonRide/Llama-3.2-3B-Khelavaster</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MoonRide__Llama-3.2-3B-Khelavaster-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MoonRide/Llama-3.2-3B-Khelavaster
30c3794ab0ff5351bcd7586da8b5384adc77c775
20.144905
llama3.2
1
3.607
true
false
false
true
0.576218
0.492495
49.249547
0.451567
22.686344
0.161631
16.163142
0.277685
3.691275
0.369906
5.504948
0.312168
23.574173
true
false
2025-03-10
2025-03-10
1
MoonRide/Llama-3.2-3B-Khelavaster (Merge)
Mostafa8Mehrabi_llama-3.2-1b-Insomnia-ChatBot-merged_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Mostafa8Mehrabi__llama-3.2-1b-Insomnia-ChatBot-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged
301ab532cd25a048a183768891b7cf095b61dd9f
3.192716
1
1.236
false
false
false
true
0.776373
0.132067
13.206736
0.300351
2.467518
0.007553
0.755287
0.236577
0
0.338156
1.269531
0.113115
1.457225
false
false
2025-03-10
2025-03-10
0
Mostafa8Mehrabi/llama-3.2-1b-Insomnia-ChatBot-merged
MrRobotoAI_MrRoboto-ProLong-8b-v4i_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MrRobotoAI/MrRoboto-ProLong-8b-v4i" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MrRobotoAI/MrRoboto-ProLong-8b-v4i</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MrRobotoAI__MrRoboto-ProLong-8b-v4i-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MrRobotoAI/MrRoboto-ProLong-8b-v4i
aed2a68b257f1c12bf75ae7d98f6ab2d235e1061
17.582919
0
4.015
false
false
false
false
1.158625
0.38346
38.346033
0.458549
23.792249
0.055136
5.513595
0.28943
5.257271
0.401375
9.605208
0.306848
22.983156
false
false
2024-12-29
0
Removed
MrRobotoAI_MrRoboto-ProLongBASE-pt8-unaligned-8b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MrRobotoAI/MrRoboto-ProLongBASE-pt8-unaligned-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MrRobotoAI/MrRoboto-ProLongBASE-pt8-unaligned-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MrRobotoAI__MrRoboto-ProLongBASE-pt8-unaligned-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MrRobotoAI/MrRoboto-ProLongBASE-pt8-unaligned-8b
7254c57bd7b62a66572a5fed9ef9451fd81b7e9b
15.935346
0
4.015
false
false
false
false
1.451374
0.34754
34.754008
0.451525
22.94121
0.042296
4.229607
0.28104
4.138702
0.427885
12.152344
0.256566
17.396203
false
false
2024-12-29
0
Removed
MultivexAI_Gladiator-Mini-Exp-1211-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MultivexAI/Gladiator-Mini-Exp-1211-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MultivexAI/Gladiator-Mini-Exp-1211-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MultivexAI__Gladiator-Mini-Exp-1211-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MultivexAI/Gladiator-Mini-Exp-1211-3B
9f3f58da3fb4b1825c2b97effc421e7809c95848
22.27221
mit
0
3.213
true
false
false
true
1.191563
0.687609
68.760888
0.448438
22.116062
0.137462
13.746224
0.272651
3.020134
0.326
2.083333
0.31516
23.906619
false
false
2024-12-11
2024-12-11
1
MultivexAI/Gladiator-Mini-Exp-1211-3B (Merge)
MultivexAI_Gladiator-Mini-Exp-1221-3B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MultivexAI__Gladiator-Mini-Exp-1221-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct
1a3f3808cd5335fb71c88d3c2b681459c2420044
20.114352
mit
0
3.213
true
false
false
true
1.202138
0.607875
60.787488
0.436977
20.395462
0.135196
13.519637
0.263423
1.789709
0.311458
1.432292
0.304854
22.761525
false
false
2024-12-20
2024-12-20
1
MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct (Merge)
MultivexAI_Gladiator-Mini-Exp-1221-3B-Instruct-V2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct-V2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct-V2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/MultivexAI__Gladiator-Mini-Exp-1221-3B-Instruct-V2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct-V2
b092e130c61aa44f2556b5db224b4df545fb51aa
20.415196
mit
0
3.213
true
false
false
true
1.181576
0.621539
62.153863
0.438883
20.651248
0.141239
14.123867
0.263423
1.789709
0.300823
1.269531
0.302527
22.502955
false
false
2024-12-21
2024-12-21
1
MultivexAI/Gladiator-Mini-Exp-1221-3B-Instruct-V2 (Merge)