Dataset schema (column name, dtype, and the viewer's observed length range, value range, or number of distinct values):

| Column | Dtype | Lengths / range / values |
|---|---|---|
| eval_name | string | lengths 12 to 111 |
| Precision | string | 3 values |
| Type | string | 7 values |
| T | string | 7 values |
| Weight type | string | 2 values |
| Architecture | string | 64 values |
| Model | string | lengths 355 to 689 |
| fullname | string | lengths 4 to 102 |
| Model sha | string | lengths 0 to 40 |
| Average ⬆️ | float64 | 0.74 to 52.1 |
| Hub License | string | 27 values |
| Hub ❤️ | int64 | 0 to 6.09k |
| #Params (B) | float64 | -1 to 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.04 to 187 |
| IFEval Raw | float64 | 0 to 0.9 |
| IFEval | float64 | 0 to 90 |
| BBH Raw | float64 | 0.22 to 0.83 |
| BBH | float64 | 0.25 to 76.7 |
| MATH Lvl 5 Raw | float64 | 0 to 0.71 |
| MATH Lvl 5 | float64 | 0 to 71.5 |
| GPQA Raw | float64 | 0.21 to 0.47 |
| GPQA | float64 | 0 to 29.4 |
| MUSR Raw | float64 | 0.29 to 0.6 |
| MUSR | float64 | 0 to 38.7 |
| MMLU-PRO Raw | float64 | 0.1 to 0.73 |
| MMLU-PRO | float64 | 0 to 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 525 values |
| Submission Date | string | 263 values |
| Generation | int64 | 0 to 10 |
| Base Model | string | lengths 4 to 102 |
prithivMLmods_Evac-Opus-14B-Exp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Evac-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Evac-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Evac-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Evac-Opus-14B-Exp
8e7f3150f510d948d99f28aa5c5f98fd41e1777f
39.323055
apache-2.0
4
14.77
true
false
false
false
1.944294
0.591614
59.161359
0.647544
49.581751
0.42145
42.145015
0.388423
18.456376
0.472781
18.63099
0.531666
47.96284
false
false
2025-02-12
2025-02-16
1
prithivMLmods/Evac-Opus-14B-Exp (Merge)
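Each benchmark appears twice in a record: a `Raw` accuracy in [0, 1] and a normalized 0–100 score. A minimal sketch of the rescaling, assuming the leaderboard's convention of mapping each benchmark's random-guess baseline to 0 (baseline 0 for IFEval, 0.25 for four-way multiple-choice GPQA; other benchmarks use per-subtask baselines not shown here):

```python
def normalize(raw: float, baseline: float) -> float:
    """Rescale a raw accuracy to 0-100, with the random-guess
    baseline mapped to 0 (scores below baseline clamp to 0)."""
    return max((raw - baseline) / (1.0 - baseline), 0.0) * 100.0

# Checked against the record above; the dump's values are rounded,
# so the comparison is approximate rather than exact.
ifeval = normalize(0.591614, baseline=0.0)   # IFEval: no guessing floor
gpqa = normalize(0.388423, baseline=0.25)    # GPQA: 4-option multiple choice

print(round(ifeval, 4), round(gpqa, 4))
```

Both reproduce the record's normalized columns (59.161359 and 18.456376) to within rounding, which is consistent with, but not proof of, this being the exact formula used.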
prithivMLmods_FastThink-0.5B-Tiny_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/FastThink-0.5B-Tiny" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/FastThink-0.5B-Tiny</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__FastThink-0.5B-Tiny-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/FastThink-0.5B-Tiny
c07fd949ceba096d7c2e405bcfce99e269f7ca39
7.516955
apache-2.0
4
0.494
true
false
false
false
1.075075
0.257989
25.79888
0.320558
5.01961
0.020393
2.039275
0.260906
1.454139
0.356635
3.579427
0.164894
7.210402
false
false
2025-01-20
2025-01-24
1
prithivMLmods/FastThink-0.5B-Tiny (Merge)
prithivMLmods_GWQ-9B-Preview_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ-9B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ-9B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ-9B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/GWQ-9B-Preview
5a0e00ac0ff885f54ef32e607508895bae864006
30.154536
gemma
3
9.242
true
false
false
false
4.922323
0.506584
50.658364
0.580575
40.669723
0.226586
22.65861
0.339765
11.96868
0.495104
21.821354
0.398354
33.150488
false
false
2025-01-04
2025-01-08
0
prithivMLmods/GWQ-9B-Preview
prithivMLmods_GWQ-9B-Preview2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ-9B-Preview2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ-9B-Preview2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ-9B-Preview2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/GWQ-9B-Preview2
42f5d4f7d19eb59c9408ff70cdbc30459ec1ad3d
30.047188
creativeml-openrail-m
10
9.242
true
false
false
false
4.905648
0.520897
52.089678
0.579722
40.184861
0.23716
23.716012
0.326342
10.178971
0.48599
20.815365
0.399684
33.298242
false
false
2025-01-04
2025-01-08
1
prithivMLmods/GWQ-9B-Preview2 (Merge)
prithivMLmods_GWQ2b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/GWQ2b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/GWQ2b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__GWQ2b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/GWQ2b
1d2a808ec30008a2cba697b1bb742ab67efb71f0
16.429712
gemma
4
2.614
true
false
false
false
2.408579
0.411487
41.148708
0.414337
17.68035
0.062689
6.268882
0.282718
4.362416
0.431115
12.75599
0.247257
16.361924
false
false
2025-01-09
2025-01-12
1
prithivMLmods/GWQ2b (Merge)
prithivMLmods_Gaea-Opus-14B-Exp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Gaea-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Gaea-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Gaea-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Gaea-Opus-14B-Exp
3a4b38d8906d4eeafcc31601b08994e73eb75408
40.113808
apache-2.0
2
14.766
true
false
false
false
2.004068
0.595635
59.563514
0.656047
50.512294
0.427492
42.749245
0.39094
18.791946
0.485896
20.170313
0.54006
48.895538
false
false
2025-03-11
2025-03-13
1
prithivMLmods/Gaea-Opus-14B-Exp (Merge)
prithivMLmods_Galactic-Qwen-14B-Exp1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Galactic-Qwen-14B-Exp1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Galactic-Qwen-14B-Exp1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Galactic-Qwen-14B-Exp1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Galactic-Qwen-14B-Exp1
9e988a4a9bb65a420c511b23dbe0f09685c18e1f
39.469505
apache-2.0
2
14.766
true
false
false
false
2.103077
0.58322
58.32203
0.658226
50.989572
0.401813
40.181269
0.393456
19.127517
0.478052
19.35651
0.539561
48.84013
false
false
2025-03-10
2025-03-12
1
prithivMLmods/Galactic-Qwen-14B-Exp1 (Merge)
prithivMLmods_Galactic-Qwen-14B-Exp2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Galactic-Qwen-14B-Exp2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Galactic-Qwen-14B-Exp2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Galactic-Qwen-14B-Exp2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Galactic-Qwen-14B-Exp2
2afd2e6e33627c1241ae87f6710bfc0880285c7e
43.563718
apache-2.0
3
14.766
true
false
false
false
3.180329
0.66203
66.203008
0.7203
59.917317
0.347432
34.743202
0.399329
19.910515
0.535385
28.489844
0.569066
52.118425
false
false
2025-03-10
2025-03-13
1
prithivMLmods/Galactic-Qwen-14B-Exp2 (Merge)
prithivMLmods_Gauss-Opus-14B-R999_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Gauss-Opus-14B-R999" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Gauss-Opus-14B-R999</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Gauss-Opus-14B-R999-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Gauss-Opus-14B-R999
12db6077af849038d206775339a1dc78df9a14cf
38.802495
apache-2.0
1
14.77
true
false
false
false
1.881754
0.390655
39.065457
0.622783
44.936115
0.575529
57.55287
0.391779
18.903803
0.533833
27.829167
0.500748
44.527556
false
false
2025-03-03
2025-03-04
1
prithivMLmods/Gauss-Opus-14B-R999 (Merge)
prithivMLmods_Jolt-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Jolt-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Jolt-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Jolt-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Jolt-v0.1
077b9bf6b346af0a865d4d9b8618c8349a03b9c6
37.19436
apache-2.0
2
14.766
true
false
false
false
3.951695
0.509207
50.920668
0.652141
50.029742
0.356495
35.649547
0.380034
17.337808
0.484719
20.489844
0.538647
48.738549
false
false
2025-02-05
2025-02-07
1
prithivMLmods/Jolt-v0.1 (Merge)
prithivMLmods_Lacerta-Opus-14B-Elite8_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Lacerta-Opus-14B-Elite8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Lacerta-Opus-14B-Elite8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Lacerta-Opus-14B-Elite8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Lacerta-Opus-14B-Elite8
944dbb7ef1ed67b64bba1078531eb2611268d3f9
38.069826
apache-2.0
2
14.766
true
false
false
false
2.021558
0.614145
61.414491
0.640138
48.182387
0.364804
36.480363
0.378356
17.114094
0.463542
17.209375
0.532164
48.018248
false
false
2025-02-27
2025-02-27
1
prithivMLmods/Lacerta-Opus-14B-Elite8 (Merge)
prithivMLmods_Llama-3.1-5B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.1-5B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.1-5B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.1-5B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-3.1-5B-Instruct
310ab744cd88aecedc534abd373d2f66a0c82f19
4.207174
llama3.1
1
5.413
true
false
false
false
0.999104
0.14066
14.066012
0.305107
3.109216
0.015106
1.510574
0.264262
1.901566
0.354
2.616667
0.118351
2.039007
false
false
2025-01-04
2025-01-12
0
prithivMLmods/Llama-3.1-5B-Instruct
prithivMLmods_Llama-3.1-8B-Open-SFT_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.1-8B-Open-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.1-8B-Open-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.1-8B-Open-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-3.1-8B-Open-SFT
e5d7fa281735f7fcc09fdb5810a2118789040d67
21.043704
creativeml-openrail-m
6
8.03
true
false
false
false
1.455588
0.412262
41.226169
0.496798
28.179928
0.121601
12.160121
0.309564
7.941834
0.390365
8.728906
0.352227
28.025266
false
false
2024-12-18
2025-01-12
1
prithivMLmods/Llama-3.1-8B-Open-SFT (Merge)
prithivMLmods_Llama-3.2-3B-Math-Oct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.2-3B-Math-Oct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.2-3B-Math-Oct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.2-3B-Math-Oct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-3.2-3B-Math-Oct
5d72ae9689eb8307a741c6e7a455e427a792cd15
17.441954
llama3.2
1
3.213
true
false
false
false
1.191365
0.458523
45.852338
0.437184
19.94675
0.115559
11.555891
0.258389
1.118568
0.34699
4.940365
0.29114
21.23781
false
false
2025-01-22
2025-01-24
1
prithivMLmods/Llama-3.2-3B-Math-Oct (Merge)
prithivMLmods_Llama-3.2-6B-AlgoCode_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-3.2-6B-AlgoCode" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-3.2-6B-AlgoCode</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-3.2-6B-AlgoCode-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-3.2-6B-AlgoCode
e111d34ff9033fe36b4f1c283a17d017b4e4e5c6
9.301
llama3.2
1
6.339
true
false
false
false
1.554721
0.213576
21.357554
0.374774
11.602526
0.013595
1.359517
0.286913
4.9217
0.401344
7.701302
0.179771
8.863401
false
false
2025-01-10
2025-01-12
0
prithivMLmods/Llama-3.2-6B-AlgoCode
prithivMLmods_Llama-8B-Distill-CoT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-8B-Distill-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-8B-Distill-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-8B-Distill-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-8B-Distill-CoT
4c2d02c2cd92f4c371547201027202ac42d88a71
20.756374
llama3.1
5
8.03
true
false
false
false
1.435005
0.334151
33.415116
0.429762
19.595123
0.400302
40.030211
0.28943
5.257271
0.371979
6.997396
0.273188
19.243129
false
false
2025-01-21
2025-01-22
1
prithivMLmods/Llama-8B-Distill-CoT (Merge)
prithivMLmods_Llama-Deepsync-1B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-Deepsync-1B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-Deepsync-1B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-Deepsync-1B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-Deepsync-1B
03a9a38ffbb49f0f176a901a5fab3e444d6131fe
10.269419
creativeml-openrail-m
3
1.236
true
false
false
false
0.754274
0.357007
35.700719
0.338563
7.763873
0.043807
4.380665
0.260067
1.342282
0.35651
4.230469
0.173787
8.198508
false
false
2024-12-29
2025-01-12
1
prithivMLmods/Llama-Deepsync-1B (Merge)
prithivMLmods_Llama-Deepsync-3B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-Deepsync-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-Deepsync-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-Deepsync-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-Deepsync-3B
9f7c81f997f9a35797b511197e48a64ffb6d046f
17.176507
creativeml-openrail-m
9
3.213
true
false
false
false
1.216154
0.430222
43.022181
0.429152
18.963664
0.117825
11.782477
0.271812
2.908277
0.332385
3.814844
0.303108
22.567598
false
false
2024-12-29
2025-01-12
1
prithivMLmods/Llama-Deepsync-3B (Merge)
prithivMLmods_Llama-Express.1-Math_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Llama-Express.1-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Llama-Express.1-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Llama-Express.1-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Llama-Express.1-Math
9c32d92f0ef3a4c4935992c9a5074d7a65ea91bc
12.170623
llama3.2
1
1.236
true
false
false
true
0.71209
0.508432
50.843207
0.336381
7.19902
0.055891
5.589124
0.263423
1.789709
0.314344
0.826302
0.160987
6.776374
false
false
2025-01-21
2025-01-25
1
prithivMLmods/Llama-Express.1-Math (Merge)
prithivMLmods_LwQ-10B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/LwQ-10B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/LwQ-10B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__LwQ-10B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/LwQ-10B-Instruct
3db52014aba9ec7163c28af47aac1f07af8fe0f6
20.967461
llama3.1
1
10.732
true
false
false
false
1.450351
0.393477
39.347709
0.512171
31.590273
0.04003
4.003021
0.312081
8.277405
0.454396
16.832813
0.331782
25.753546
false
false
2025-01-14
2025-01-19
1
prithivMLmods/LwQ-10B-Instruct (Merge)
prithivMLmods_LwQ-Reasoner-10B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/LwQ-Reasoner-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/LwQ-Reasoner-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__LwQ-Reasoner-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/LwQ-Reasoner-10B
fcd46007bd9f098004843dd79042a99543a22293
26.994378
llama3.1
2
10.306
true
false
false
false
1.789195
0.294134
29.413401
0.586625
40.337248
0.358006
35.800604
0.346477
12.863535
0.407854
8.581771
0.414727
34.96971
false
false
2025-01-18
2025-01-19
1
prithivMLmods/LwQ-Reasoner-10B (Merge)
prithivMLmods_Magellanic-Opus-14B-Exp_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Magellanic-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Magellanic-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Magellanic-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Magellanic-Opus-14B-Exp
64ae086663b4008fcd263a76f7d4360a50d9e81e
40.055124
apache-2.0
2
14.766
true
false
false
false
1.941225
0.686635
68.66348
0.638251
48.003324
0.379909
37.990937
0.374161
16.55481
0.492625
21.644792
0.527261
47.473404
false
false
2025-02-13
2025-02-14
1
prithivMLmods/Magellanic-Opus-14B-Exp (Merge)
prithivMLmods_Magellanic-Qwen-25B-R999_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Magellanic-Qwen-25B-R999" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Magellanic-Qwen-25B-R999</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Magellanic-Qwen-25B-R999-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Magellanic-Qwen-25B-R999
63ecd8209d9a194ec6b33c38c695323347b6b542
4.976527
0
24.962
false
false
false
false
2.40456
0.187272
18.727199
0.260757
2.003559
0.005287
0.528701
0.250839
0.111857
0.383115
5.15599
0.129987
3.331856
false
false
2025-03-04
0
Removed
prithivMLmods_Megatron-Corpus-14B-Exp_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Corpus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Corpus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Corpus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Megatron-Corpus-14B-Exp
1d1f2380140d36b460f3ed1d5de0b0edbd48b9a7
35.553957
apache-2.0
4
14.766
true
false
false
false
1.806247
0.498266
49.826571
0.635517
47.91898
0.3429
34.29003
0.363255
15.100671
0.476688
18.852604
0.526014
47.334885
false
false
2025-02-04
2025-02-13
1
prithivMLmods/Megatron-Corpus-14B-Exp (Merge)
prithivMLmods_Megatron-Corpus-14B-Exp.v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Corpus-14B-Exp.v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Corpus-14B-Exp.v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Corpus-14B-Exp.v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Megatron-Corpus-14B-Exp.v2
5d7501a6e01875c268ad73343befd6a2906ccd14
31.898674
apache-2.0
2
14.766
true
false
false
false
3.538279
0.48705
48.704992
0.632146
46.788412
0.259063
25.906344
0.342282
12.304251
0.449
15.358333
0.480967
42.329713
false
false
2025-02-06
2025-02-07
1
prithivMLmods/Megatron-Corpus-14B-Exp.v2 (Merge)
prithivMLmods_Megatron-Opus-14B-2.0_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-14B-2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-14B-2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-14B-2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Megatron-Opus-14B-2.0
c3ddba573bd07f5f233b28d1a5d95a8d25ba6e33
36.80518
llama3.1
3
14.66
true
false
false
true
0.88246
0.669374
66.937393
0.687056
54.699622
0.277946
27.794562
0.35906
14.541387
0.414031
10.520573
0.517038
46.337544
false
false
2025-02-08
2025-02-09
1
prithivMLmods/Megatron-Opus-14B-2.0 (Merge)
prithivMLmods_Megatron-Opus-14B-2.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-14B-2.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-14B-2.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-14B-2.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Megatron-Opus-14B-2.1
d22ca0d0df544e18ba758544016f8cd3c004da92
28.509697
llama3
4
14.66
true
false
false
false
0.855733
0.024555
2.455485
0.672696
52.531003
0.299849
29.984894
0.383389
17.785235
0.49275
21.927083
0.51737
46.374483
false
false
2025-02-18
2025-02-21
1
prithivMLmods/Megatron-Opus-14B-2.1 (Merge)
prithivMLmods_Megatron-Opus-14B-Exp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Megatron-Opus-14B-Exp
d6c56465b7610abbebbf6cdedae6fda92087fbfc
36.964775
apache-2.0
3
14.766
true
false
false
false
3.828068
0.497941
49.794102
0.651609
50.002879
0.353474
35.347432
0.375
16.666667
0.488656
21.082031
0.54006
48.895538
false
false
2025-02-03
2025-02-03
1
prithivMLmods/Megatron-Opus-14B-Exp (Merge)
prithivMLmods_Megatron-Opus-14B-Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-14B-Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-14B-Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-14B-Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Megatron-Opus-14B-Stock
a9d75a507fb0e9320e70c120b0f1823cc377cea2
36.31374
3
14.766
false
false
false
false
3.650521
0.517375
51.737501
0.641175
48.128851
0.334592
33.459215
0.375
16.666667
0.482021
20.185937
0.529338
47.70427
false
false
2025-02-03
2025-02-03
1
prithivMLmods/Megatron-Opus-14B-Stock (Merge)
prithivMLmods_Megatron-Opus-7B-Exp_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Megatron-Opus-7B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Megatron-Opus-7B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Megatron-Opus-7B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Megatron-Opus-7B-Exp
1856f046b2fe15ccf1baac686aa4595ab4245f86
27.617726
llama3.1
1
7.456
true
false
false
false
1.196807
0.60173
60.173008
0.536715
34.371535
0.19713
19.712991
0.311242
8.165548
0.418583
11.05625
0.390043
32.227024
false
false
2025-02-03
2025-02-03
0
prithivMLmods/Megatron-Opus-7B-Exp
prithivMLmods_Messier-Opus-14B-Elite7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Messier-Opus-14B-Elite7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Messier-Opus-14B-Elite7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Messier-Opus-14B-Elite7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Messier-Opus-14B-Elite7
d8748c313cc0daa3e1112e953b2826404a0e577a
41.662772
apache-2.0
2
14.766
true
false
false
false
1.789413
0.711339
71.133925
0.649861
49.704671
0.4071
40.70997
0.39094
18.791946
0.488563
20.703646
0.540392
48.932476
false
false
2025-02-26
2025-02-27
1
prithivMLmods/Messier-Opus-14B-Elite7 (Merge)
prithivMLmods_Omni-Reasoner-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Omni-Reasoner-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Omni-Reasoner-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Omni-Reasoner-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Omni-Reasoner-Merged
5c34ad1b2510c510025ac724a16bed7f5ae5f1c3
29.234865
5
7.616
false
false
false
false
1.262832
0.459947
45.994738
0.550785
35.361777
0.333082
33.308157
0.303691
7.158837
0.461646
16.205729
0.43642
37.37995
false
false
2025-01-16
2025-01-17
1
prithivMLmods/Omni-Reasoner-Merged (Merge)
prithivMLmods_Omni-Reasoner3-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Omni-Reasoner3-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Omni-Reasoner3-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Omni-Reasoner3-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Omni-Reasoner3-Merged
a8fbe5740e04a78661dedd16597fa4d5a135ad95
18.433737
1
3.213
false
false
false
false
1.1766
0.49347
49.346955
0.438785
20.586522
0.108761
10.876133
0.264262
1.901566
0.352229
6.228646
0.294963
21.662603
false
false
2025-01-17
2025-01-17
1
prithivMLmods/Omni-Reasoner3-Merged (Merge)
prithivMLmods_Pegasus-Opus-14B-Exp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Pegasus-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Pegasus-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Pegasus-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Pegasus-Opus-14B-Exp
7587dd36f6780374f07a0e57433a7f1cd382381e
41.62328
apache-2.0
4
14.766
true
false
false
false
1.878115
0.698175
69.817529
0.654755
50.306949
0.40861
40.861027
0.395134
19.35123
0.485958
20.378125
0.541223
49.024823
false
false
2025-02-26
2025-03-01
1
prithivMLmods/Pegasus-Opus-14B-Exp (Merge)
prithivMLmods_Phi-4-Empathetic_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Empathetic" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Empathetic</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Empathetic-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi-4-Empathetic
181a87cfc05f0ee538b14cf4a773ad3b816224fe
28.208397
mit
8
14.66
true
false
false
false
1.795276
0.049659
4.965935
0.672682
52.838938
0.262085
26.208459
0.380034
17.337808
0.499135
22.72526
0.506566
45.17398
false
false
2025-01-10
2025-01-12
1
prithivMLmods/Phi-4-Empathetic (Merge)
prithivMLmods_Phi-4-Math-IO_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Math-IO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Math-IO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Math-IO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi-4-Math-IO
2e3f81b0c1613d33a4b0e216120fa3a3dd9206f8
31.821783
mit
4
14.66
true
false
false
false
1.933893
0.058977
5.897685
0.666826
52.093771
0.457704
45.770393
0.39849
19.798658
0.487292
20.644792
0.520529
46.725399
false
false
2025-01-10
2025-01-12
1
prithivMLmods/Phi-4-Math-IO (Merge)
prithivMLmods_Phi-4-QwQ_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-QwQ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-QwQ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-QwQ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi-4-QwQ
f9d9cc11a7c9e56420b705ac97f06362321dd89a
31.262675
mit
8
14.66
true
false
false
false
1.971529
0.055929
5.592938
0.669557
52.28685
0.457704
45.770393
0.39094
18.791946
0.465063
17.632813
0.52751
47.501108
false
false
2025-01-10
2025-01-12
1
prithivMLmods/Phi-4-QwQ (Merge)
prithivMLmods_Phi-4-Super_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Super-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi-4-Super
d0632dd9df3d6a8ae4f10f2185d38eeb61cab9d2
30.387371
2
14.66
false
false
false
false
1.925291
0.048136
4.813561
0.672012
52.697295
0.348943
34.89426
0.394295
19.239374
0.504375
23.280208
0.526596
47.399527
false
false
2025-01-23
2025-01-24
1
prithivMLmods/Phi-4-Super (Merge)
prithivMLmods_Phi-4-Super-1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Super-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Super-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Super-1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi-4-Super-1
081e3442df878853ab8bd765430c961658ce5024
30.230267
6
14.66
false
false
false
false
1.871196
0.041766
4.176585
0.672934
52.905831
0.351964
35.196375
0.393456
19.127517
0.50174
22.917448
0.523521
47.057846
false
false
2025-01-24
2025-01-24
1
prithivMLmods/Phi-4-Super-1 (Merge)
prithivMLmods_Phi-4-Super-o1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-Super-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-Super-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-Super-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi-4-Super-o1
081e3442df878853ab8bd765430c961658ce5024
30.230267
6
14.66
false
false
false
false
1.928305
0.041766
4.176585
0.672934
52.905831
0.351964
35.196375
0.393456
19.127517
0.50174
22.917448
0.523521
47.057846
false
false
2025-01-24
2025-01-24
1
prithivMLmods/Phi-4-Super-o1 (Merge)
prithivMLmods_Phi-4-o1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi-4-o1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi-4-o1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi-4-o1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi-4-o1
aa2a7571e9dbce0fefe98479fe04f298f2491b8c
30.20429
mit
22
14.66
true
false
false
false
1.738167
0.028976
2.897645
0.668873
52.170862
0.399547
39.954683
0.38255
17.673378
0.497771
22.154687
0.51737
46.374483
false
false
2025-01-08
2025-01-09
1
prithivMLmods/Phi-4-o1 (Merge)
prithivMLmods_Phi4-Super_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Phi4-Super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Phi4-Super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Phi4-Super-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Phi4-Super
d27188b144a6ac8c2d70f761e8afd8b05c74fd16
30.387371
2
14.66
false
false
false
false
1.837064
0.048136
4.813561
0.672012
52.697295
0.348943
34.89426
0.394295
19.239374
0.504375
23.280208
0.526596
47.399527
false
false
2025-01-23
2025-01-23
1
prithivMLmods/Phi4-Super (Merge)
prithivMLmods_Porpoise-Opus-14B-Exp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Porpoise-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Porpoise-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Porpoise-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Porpoise-Opus-14B-Exp
5b35f2520b75aceb722cc54f2bdc27f70b5fd140
41.769424
apache-2.0
2
14.766
true
false
false
false
1.928898
0.709816
70.981551
0.65189
49.946613
0.404079
40.407855
0.393456
19.127517
0.492563
21.303646
0.539644
48.849365
false
false
2025-02-26
2025-02-27
1
prithivMLmods/Porpoise-Opus-14B-Exp (Merge)
prithivMLmods_Primal-Opus-14B-Optimus-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Primal-Opus-14B-Optimus-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Primal-Opus-14B-Optimus-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Primal-Opus-14B-Optimus-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Primal-Opus-14B-Optimus-v1
240fd09db4f126801d50bd74a74700927918c2d4
36.064412
apache-2.0
4
14.766
true
false
false
false
3.978811
0.501313
50.131318
0.641942
48.271703
0.338369
33.836858
0.372483
16.331096
0.484719
20.489844
0.525931
47.32565
false
false
2025-02-02
2025-02-03
1
prithivMLmods/Primal-Opus-14B-Optimus-v1 (Merge)
prithivMLmods_Primal-Opus-14B-Optimus-v2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Primal-Opus-14B-Optimus-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Primal-Opus-14B-Optimus-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Primal-Opus-14B-Optimus-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Primal-Opus-14B-Optimus-v2
fef7d92fabb736447402b52f0f4ccc932b75fecc
40.912713
apache-2.0
4
14.766
true
false
false
false
1.94029
0.640373
64.03731
0.654378
50.181344
0.420695
42.069486
0.391779
18.903803
0.48999
21.148698
0.542221
49.135638
false
false
2025-02-27
2025-02-27
1
prithivMLmods/Primal-Opus-14B-Optimus-v2 (Merge)
prithivMLmods_QwQ-LCoT-14B-Conversational_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT-14B-Conversational" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT-14B-Conversational</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT-14B-Conversational-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-LCoT-14B-Conversational
60ef4aa0a2660f9b6f28a3de773729969a1df9ae
35.683067
apache-2.0
4
14.77
true
false
false
false
3.908904
0.404743
40.474275
0.623983
45.62626
0.465257
46.52568
0.349832
13.310962
0.484719
20.623177
0.527842
47.538047
false
false
2025-01-18
2025-01-19
1
prithivMLmods/QwQ-LCoT-14B-Conversational (Merge)
prithivMLmods_QwQ-LCoT-3B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT-3B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT-3B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT-3B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-LCoT-3B-Instruct
1f47223ac1c6069c3e53b75a45ad496f0fb9a124
24.021307
creativeml-openrail-m
4
3.086
true
false
false
false
1.530787
0.435442
43.54424
0.476298
26.621188
0.282477
28.247734
0.281879
4.250559
0.435792
12.773958
0.358211
28.69016
false
false
2024-12-12
2025-01-12
1
prithivMLmods/QwQ-LCoT-3B-Instruct (Merge)
prithivMLmods_QwQ-LCoT-7B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-LCoT-7B-Instruct
06f0076fcf5cb72222513e6c76bd33e1ebaa97b7
30.8638
creativeml-openrail-m
17
7.616
true
false
false
false
1.300609
0.49869
49.869014
0.546647
34.780933
0.371601
37.160121
0.302013
6.935123
0.480188
19.390104
0.433428
37.047503
false
false
2024-12-14
2025-01-07
1
prithivMLmods/QwQ-LCoT-7B-Instruct (Merge)
prithivMLmods_QwQ-LCoT1-Merged_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT1-Merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT1-Merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT1-Merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-LCoT1-Merged
d85a4f359bc568afb7b1a2a6e6503934bb352ab6
30.445291
2
7.616
false
false
false
false
1.310759
0.475135
47.513486
0.548096
35.166254
0.373112
37.311178
0.307047
7.606264
0.469615
17.76849
0.435755
37.306073
false
false
2025-01-21
2025-01-22
1
prithivMLmods/QwQ-LCoT1-Merged (Merge)
prithivMLmods_QwQ-LCoT2-7B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-LCoT2-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-LCoT2-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-LCoT2-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-LCoT2-7B-Instruct
f2ea462f6d3f6cf104313b1329909cb15a388841
30.32393
apache-2.0
5
7.616
true
false
false
false
2.051162
0.556118
55.611777
0.542486
34.366737
0.327039
32.703927
0.297819
6.375839
0.456438
15.754688
0.434176
37.130615
false
false
2025-01-20
2025-01-24
1
prithivMLmods/QwQ-LCoT2-7B-Instruct (Merge)
prithivMLmods_QwQ-MathOct-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-MathOct-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-MathOct-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-MathOct-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-MathOct-7B
d2ff038987cc16a7b317034929dd9ab35265e308
28.497759
apache-2.0
2
7.616
true
false
false
false
1.329046
0.46844
46.84404
0.548551
35.254667
0.295317
29.531722
0.302852
7.04698
0.460063
15.307812
0.433012
37.00133
false
false
2025-01-11
2025-01-19
1
prithivMLmods/QwQ-MathOct-7B (Merge)
prithivMLmods_QwQ-R1-Distill-1.5B-CoT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-R1-Distill-1.5B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-R1-Distill-1.5B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-R1-Distill-1.5B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-R1-Distill-1.5B-CoT
cd1a92a4fffbc923013e2a77d9d7f2c8b2a738ae
13.931651
apache-2.0
4
1.777
true
false
false
false
1.174832
0.219396
21.939565
0.366621
11.476456
0.334592
33.459215
0.286074
4.809843
0.343396
1.757812
0.191323
10.147015
false
false
2025-01-21
2025-01-22
1
prithivMLmods/QwQ-R1-Distill-1.5B-CoT (Merge)
prithivMLmods_QwQ-R1-Distill-7B-CoT_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/QwQ-R1-Distill-7B-CoT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/QwQ-R1-Distill-7B-CoT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__QwQ-R1-Distill-7B-CoT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/QwQ-R1-Distill-7B-CoT
db0c74ffe611d00eb0a5df4413f3eced7fdacb78
22.192243
apache-2.0
4
7.616
true
false
false
false
1.342241
0.350038
35.00379
0.438789
20.953831
0.468278
46.827795
0.293624
5.816555
0.377906
4.504948
0.280419
20.046543
false
false
2025-01-21
2025-01-22
1
prithivMLmods/QwQ-R1-Distill-7B-CoT (Merge)
prithivMLmods_Qwen-7B-Distill-Reasoner_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen-7B-Distill-Reasoner" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen-7B-Distill-Reasoner</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen-7B-Distill-Reasoner-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Qwen-7B-Distill-Reasoner
b83c5c3d748f756927b87ae978f94fdb033c526b
21.484736
apache-2.0
2
7.616
true
false
false
false
1.320808
0.339571
33.957123
0.440933
22.175998
0.395015
39.501511
0.327181
10.290828
0.365969
2.779427
0.281832
20.203531
false
false
2025-01-28
2025-01-28
1
prithivMLmods/Qwen-7B-Distill-Reasoner (Merge)
prithivMLmods_Qwen2.5-1.5B-DeepSeek-R1-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen2.5-1.5B-DeepSeek-R1-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct
ca8cf376e59e873d70f8b9dffcb19aecc9d32fab
4.06773
1
1.777
false
false
false
false
1.221616
0.139686
13.968603
0.282437
1.361067
0
0
0.276007
3.467562
0.372354
4.244271
0.112284
1.364879
false
false
2025-01-29
2025-01-29
1
prithivMLmods/Qwen2.5-1.5B-DeepSeek-R1-Instruct (Merge)
prithivMLmods_Qwen2.5-14B-DeepSeek-R1-1M_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen2.5-14B-DeepSeek-R1-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M
bc7898d09ac620cf86afa3237daa8181f689345b
34.333865
5
14.77
false
false
false
true
6.192776
0.419281
41.928084
0.593485
40.75991
0.51284
51.283988
0.332215
10.961969
0.460604
17.742188
0.489943
43.327054
false
false
2025-01-29
2025-02-01
1
prithivMLmods/Qwen2.5-14B-DeepSeek-R1-1M (Merge)
prithivMLmods_Qwen2.5-7B-DeepSeek-R1-1M_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Qwen2.5-7B-DeepSeek-R1-1M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M
a42acdfe01bd887dde308deeb07d570979976838
5.383076
3
7.616
false
false
false
false
1.322782
0.186123
18.612282
0.312555
4.665735
0.015106
1.510574
0.261745
1.565996
0.341688
3.710937
0.120096
2.232934
false
false
2025-01-29
2025-01-29
1
prithivMLmods/Qwen2.5-7B-DeepSeek-R1-1M (Merge)
prithivMLmods_SmolLM2-CoT-360M_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/SmolLM2-CoT-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/SmolLM2-CoT-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__SmolLM2-CoT-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/SmolLM2-CoT-360M
474240d772fbb3b8da6f8eb47f32dd34c6b78baf
5.950748
apache-2.0
9
0.362
true
false
false
false
0.775503
0.221569
22.156877
0.31353
4.801205
0.020393
2.039275
0.236577
0
0.379396
5.757813
0.108544
0.94932
false
false
2025-01-05
2025-01-07
1
prithivMLmods/SmolLM2-CoT-360M (Merge)
prithivMLmods_Sombrero-Opus-14B-Elite5_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Sombrero-Opus-14B-Elite5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Sombrero-Opus-14B-Elite5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Sombrero-Opus-14B-Elite5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Sombrero-Opus-14B-Elite5
f38f6b061fe97d1ad8b30aa02a1b6e18a05a7569
42.323328
apache-2.0
3
14.766
true
false
false
true
1.575029
0.788076
78.807564
0.650154
50.174647
0.535498
53.549849
0.336409
11.521253
0.428667
13.216667
0.52003
46.669991
false
false
2025-02-12
2025-02-13
1
prithivMLmods/Sombrero-Opus-14B-Elite5 (Merge)
prithivMLmods_Sombrero-Opus-14B-Elite6_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Sombrero-Opus-14B-Elite6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Sombrero-Opus-14B-Elite6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Sombrero-Opus-14B-Elite6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Sombrero-Opus-14B-Elite6
bdadcbf346ce8cc10353ba5b8019487e1b982b08
41.880845
apache-2.0
2
14.766
true
false
false
false
3.495365
0.722605
72.260491
0.648794
49.595191
0.407855
40.785498
0.393456
19.127517
0.488594
20.740885
0.538979
48.775488
false
false
2025-02-22
2025-02-25
1
prithivMLmods/Sombrero-Opus-14B-Elite6 (Merge)
prithivMLmods_Sombrero-Opus-14B-Sm1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Sombrero-Opus-14B-Sm1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Sombrero-Opus-14B-Sm1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Sombrero-Opus-14B-Sm1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Sombrero-Opus-14B-Sm1
ada12c7016fdc48547b14e15f415791afec55f8f
39.22382
apache-2.0
2
14.77
true
false
false
false
1.814819
0.381287
38.128721
0.635462
47.031254
0.566465
56.646526
0.403523
20.469799
0.529896
27.236979
0.512467
45.829639
false
false
2025-03-06
2025-03-08
1
prithivMLmods/Sombrero-Opus-14B-Sm1 (Merge)
prithivMLmods_Sombrero-Opus-14B-Sm2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Sombrero-Opus-14B-Sm2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Sombrero-Opus-14B-Sm2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Sombrero-Opus-14B-Sm2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Sombrero-Opus-14B-Sm2
a676ec7bf723584225f276468bd8ebd5633de464
38.980475
apache-2.0
2
14.77
true
false
false
false
1.729117
0.427224
42.722421
0.660937
51.251858
0.486405
48.640483
0.388423
18.456376
0.508813
24.534896
0.534491
48.276817
false
false
2025-03-06
2025-03-08
1
prithivMLmods/Sombrero-Opus-14B-Sm2 (Merge)
prithivMLmods_Sombrero-Opus-14B-Sm4_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Sombrero-Opus-14B-Sm4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Sombrero-Opus-14B-Sm4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Sombrero-Opus-14B-Sm4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Sombrero-Opus-14B-Sm4
ba5412590ee35408c9868c3523f9a96a9f891ceb
39.385451
apache-2.0
3
14.77
true
false
false
false
1.726249
0.434693
43.469328
0.661278
51.15996
0.487915
48.791541
0.395134
19.35123
0.519167
25.7625
0.530003
47.778147
false
false
2025-03-06
2025-03-12
1
prithivMLmods/Sombrero-Opus-14B-Sm4 (Merge)
prithivMLmods_Sombrero-Opus-14B-Sm5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Sombrero-Opus-14B-Sm5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Sombrero-Opus-14B-Sm5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Sombrero-Opus-14B-Sm5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Sombrero-Opus-14B-Sm5
f3994069ea18d5000c4f47e3a013bf57c0c9e338
41.11318
apache-2.0
2
14.766
true
false
false
false
1.938348
0.685161
68.516093
0.656394
50.596006
0.409366
40.936556
0.386745
18.232662
0.480625
19.511458
0.539977
48.886303
false
false
2025-03-06
2025-03-08
1
prithivMLmods/Sombrero-Opus-14B-Sm5 (Merge)
prithivMLmods_Sqweeks-7B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Sqweeks-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Sqweeks-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Sqweeks-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Sqweeks-7B-Instruct
3806238f5f13f425ac429c41530adb0148b6881e
23.920659
apache-2.0
3
7.616
true
false
false
false
2.005205
0.215799
21.579853
0.466669
24.98215
0.51435
51.435045
0.307047
7.606264
0.447604
14.217188
0.313331
23.703457
false
false
2025-02-05
2025-02-07
1
prithivMLmods/Sqweeks-7B-Instruct (Merge)
prithivMLmods_Tadpole-Opus-14B-Exp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Tadpole-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Tadpole-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Tadpole-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Tadpole-Opus-14B-Exp
9520a4148e858334a4f6ca7424307f95b0980b06
36.878693
apache-2.0
1
14.766
true
false
false
false
2.083377
0.574952
57.495224
0.636859
47.778764
0.313444
31.344411
0.385906
18.120805
0.472844
18.505469
0.532247
48.027482
false
false
2025-02-25
2025-02-26
1
prithivMLmods/Tadpole-Opus-14B-Exp (Merge)
prithivMLmods_Taurus-Opus-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Taurus-Opus-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Taurus-Opus-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Taurus-Opus-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Taurus-Opus-7B
4b9918fb7ed2a92bdb1beae11deb337a3745d053
25.88865
apache-2.0
4
7.456
true
false
false
false
1.36362
0.422328
42.232831
0.536736
34.234016
0.216767
21.676737
0.326342
10.178971
0.439885
14.21901
0.395113
32.790337
false
false
2025-01-25
2025-01-27
1
prithivMLmods/Taurus-Opus-7B (Merge)
prithivMLmods_Triangulum-10B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Triangulum-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Triangulum-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Triangulum-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Triangulum-10B
d3776fbe6bfc884f1380fe128223759d76214049
28.300666
llama3.1
4
10.306
true
false
false
false
1.718828
0.322935
32.293537
0.596802
42.240747
0.354985
35.498489
0.354027
13.870246
0.41725
10.589583
0.417803
35.311392
false
false
2024-12-30
2025-01-07
1
prithivMLmods/Triangulum-10B (Merge)
prithivMLmods_Triangulum-5B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Triangulum-5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Triangulum-5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Triangulum-5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Triangulum-5B
55e161fc171b17b3e6c15aef9d5318a51bdb48fb
4.011792
creativeml-openrail-m
2
5.413
true
false
false
false
0.984858
0.128321
12.832063
0.312412
4.293502
0.010574
1.057402
0.255034
0.671141
0.344542
2.734375
0.12234
2.48227
false
false
2024-12-31
2025-01-07
1
prithivMLmods/Triangulum-5B (Merge)
prithivMLmods_Triangulum-v2-10B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Triangulum-v2-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Triangulum-v2-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Triangulum-v2-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Triangulum-v2-10B
a23407caf6f232d305c5cdf1c802dfa430e57915
32.833839
llama3.1
1
10.306
true
false
false
false
1.815262
0.670523
67.05231
0.606453
42.754726
0.244713
24.471299
0.337248
11.63311
0.428073
12.575781
0.446642
38.51581
false
false
2025-01-30
2025-01-31
0
prithivMLmods/Triangulum-v2-10B
prithivMLmods_Tucana-Opus-14B-r999_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Tucana-Opus-14B-r999" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Tucana-Opus-14B-r999</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Tucana-Opus-14B-r999-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Tucana-Opus-14B-r999
b4c9f74143dd04e9eee46b3cae63cb066cce0e69
39.750666
apache-2.0
3
14.77
true
false
false
false
1.995423
0.606726
60.672571
0.655689
50.586762
0.406344
40.634441
0.391779
18.903803
0.473031
18.995573
0.538398
48.710845
false
false
2025-02-28
2025-03-01
1
prithivMLmods/Tucana-Opus-14B-r999 (Merge)
prithivMLmods_Tulu-MathLingo-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Tulu-MathLingo-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Tulu-MathLingo-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Tulu-MathLingo-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Tulu-MathLingo-8B
0fb551a24dfe1a576e2c5118a7581588d339a2e7
21.797792
creativeml-openrail-m
3
8.03
true
false
false
false
1.683148
0.55894
55.894028
0.465881
24.703351
0.145015
14.501511
0.290268
5.369128
0.386427
7.603385
0.304438
22.715352
false
false
2024-12-23
2025-01-12
1
prithivMLmods/Tulu-MathLingo-8B (Merge)
prithivMLmods_Viper-Coder-7B-Elite14_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-7B-Elite14" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-7B-Elite14</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-7B-Elite14-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-7B-Elite14
514675679f50691831cd28636c8f862a4004663e
3.487136
0
7.616
false
false
false
false
0.679944
0.148828
14.882844
0.282854
1.788972
0.010574
1.057402
0.255034
0.671141
0.342156
1.536198
0.108876
0.986259
false
false
2025-03-13
0
Removed
prithivMLmods_Viper-Coder-Hybrid-v1.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-Hybrid-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-Hybrid-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-Hybrid-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-Hybrid-v1.2
890d476f14cce4a4850f3207ab9fc701e32c11ed
38.825475
apache-2.0
2
14.766
true
false
false
false
1.955571
0.673571
67.357057
0.639075
48.286402
0.333082
33.308157
0.374161
16.55481
0.482177
20.305469
0.524269
47.140957
false
false
2025-02-20
2025-02-21
1
prithivMLmods/Viper-Coder-Hybrid-v1.2 (Merge)
prithivMLmods_Viper-Coder-Hybrid-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-Hybrid-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-Hybrid-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-Hybrid-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-Hybrid-v1.3
a296ce2b8ee219ea185c917fa907560516e18f3e
40.411993
apache-2.0
8
14.766
true
false
false
true
1.798111
0.755478
75.547769
0.6471
49.614467
0.451662
45.166163
0.338087
11.744966
0.440323
14.873698
0.509724
45.524897
false
false
2025-02-21
2025-02-22
1
prithivMLmods/Viper-Coder-Hybrid-v1.3 (Merge)
prithivMLmods_Viper-Coder-HybridMini-v1.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-HybridMini-v1.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-HybridMini-v1.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-HybridMini-v1.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-HybridMini-v1.3
c18d54cc4923f68958b4f32d3970d650b250c9cb
33.800749
apache-2.0
6
7.616
true
false
false
true
0.659516
0.610373
61.03727
0.536547
33.666954
0.462991
46.299094
0.317114
8.948546
0.45049
15.611198
0.435173
37.24143
false
false
2025-02-21
2025-02-22
1
prithivMLmods/Viper-Coder-HybridMini-v1.3 (Merge)
prithivMLmods_Viper-Coder-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-v0.1
39a88521fbb3b1af13922748004af131e8382c81
31.996466
apache-2.0
1
14.766
true
false
false
false
3.34892
0.552146
55.214608
0.614306
44.627257
0.327039
32.703927
0.354027
13.870246
0.439448
13.03099
0.392786
32.531767
false
false
2025-02-06
2025-02-07
1
prithivMLmods/Viper-Coder-v0.1 (Merge)
prithivMLmods_Viper-Coder-v1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-v1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-v1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-v1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-v1.1
7c0e6e2ee4684509ae4de063443259b8102fd979
40.260261
apache-2.0
7
14.77
true
false
false
false
1.81638
0.443236
44.323617
0.649229
49.26801
0.546073
54.607251
0.401007
20.134228
0.521927
26.207552
0.523188
47.020907
false
false
2025-02-13
2025-02-14
1
prithivMLmods/Viper-Coder-v1.1 (Merge)
prithivMLmods_Viper-Coder-v1.6-r999_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-v1.6-r999" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-v1.6-r999</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-v1.6-r999-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-v1.6-r999
9ade17c8100f0468211214f8c118e17a67a325fb
40.588383
apache-2.0
5
14.77
true
false
false
false
1.753886
0.443286
44.328604
0.649229
49.26801
0.56571
56.570997
0.401007
20.134228
0.521927
26.207552
0.523188
47.020907
false
false
2025-03-01
2025-03-01
1
prithivMLmods/Viper-Coder-v1.6-r999 (Merge)
prithivMLmods_Viper-Coder-v1.7-Vsm6_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-Coder-v1.7-Vsm6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-Coder-v1.7-Vsm6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-Coder-v1.7-Vsm6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-Coder-v1.7-Vsm6
8ffaf8fc58e0d330bdf65baf290a4512dc7befad
38.682881
apache-2.0
3
14.766
true
false
false
false
1.631944
0.500389
50.038897
0.650234
49.53325
0.464502
46.450151
0.396812
19.574944
0.47675
18.860417
0.528757
47.639628
false
false
2025-03-06
2025-03-07
1
prithivMLmods/Viper-Coder-v1.7-Vsm6 (Merge)
prithivMLmods_Viper-OneCoder-UIGEN_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Viper-OneCoder-UIGEN" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Viper-OneCoder-UIGEN</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Viper-OneCoder-UIGEN-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Viper-OneCoder-UIGEN
bb9bce0cd291dda05c9cbf7dfd66554ac10a608f
31.347294
apache-2.0
1
14.77
true
false
false
false
1.964401
0.46919
46.918953
0.604651
42.732153
0.386707
38.670695
0.342282
12.304251
0.451417
15.19375
0.390376
32.263963
false
false
2025-02-26
2025-02-27
1
prithivMLmods/Viper-OneCoder-UIGEN (Merge)
prithivMLmods_Volans-Opus-14B-Exp_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/Volans-Opus-14B-Exp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/Volans-Opus-14B-Exp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__Volans-Opus-14B-Exp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/Volans-Opus-14B-Exp
57d238100de4e1ef2456b17181a4b9db54029664
39.707075
apache-2.0
2
14.766
true
false
false
false
2.064902
0.586768
58.676755
0.652121
49.914267
0.425227
42.522659
0.385067
18.008949
0.487198
20.39974
0.538481
48.72008
false
false
2025-03-11
2025-03-13
1
prithivMLmods/Volans-Opus-14B-Exp (Merge)
prithivMLmods_WebMind-7B-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/prithivMLmods/WebMind-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">prithivMLmods/WebMind-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/prithivMLmods__WebMind-7B-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
prithivMLmods/WebMind-7B-v0.1
4016b7b6151142622bab81d805054ff4b4d41ff9
30.805047
apache-2.0
1
7.616
true
false
false
false
1.212123
0.527816
52.781619
0.543356
35.064291
0.364804
36.480363
0.317114
8.948546
0.45374
15.117448
0.427942
36.438017
false
false
2025-02-06
2025-02-07
1
prithivMLmods/WebMind-7B-v0.1 (Merge)
pszemraj_Llama-3-6.3b-v0.1_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/pszemraj/Llama-3-6.3b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pszemraj/Llama-3-6.3b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pszemraj__Llama-3-6.3b-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
pszemraj/Llama-3-6.3b-v0.1
7000b39346162f95f19aa4ca3975242db61902d7
10.384307
llama3
6
6.3
true
false
false
false
1.628927
0.10439
10.438969
0.419681
18.679996
0.021148
2.114804
0.283557
4.474273
0.390833
6.154167
0.283993
20.443632
false
false
2024-05-17
2024-06-26
1
meta-llama/Meta-Llama-3-8B
pszemraj_Mistral-v0.3-6B_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/pszemraj/Mistral-v0.3-6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">pszemraj/Mistral-v0.3-6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/pszemraj__Mistral-v0.3-6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
pszemraj/Mistral-v0.3-6B
ae11a699012b83996361f04808f4d45debf3b01c
10.122379
apache-2.0
1
5.939
true
false
false
false
1.061079
0.245374
24.53745
0.377405
13.515091
0.013595
1.359517
0.265101
2.013423
0.390771
6.613021
0.214262
12.695774
false
false
2024-05-25
2024-06-26
2
pszemraj/Mistral-7B-v0.3-prune6 (Merge)
qingy2019_LLaMa_3.2_3B_Catalysts_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/LLaMa_3.2_3B_Catalysts" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/LLaMa_3.2_3B_Catalysts</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__LLaMa_3.2_3B_Catalysts-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/LLaMa_3.2_3B_Catalysts
3f4a318114beb37f32a2c143cbd68b6d15d18164
19.930931
apache-2.0
1
3
true
false
false
false
1.299668
0.49924
49.923979
0.446813
21.345401
0.129154
12.915408
0.288591
5.145414
0.378771
7.946354
0.300781
22.309028
false
false
2024-10-19
2024-10-29
2
meta-llama/Llama-3.2-3B-Instruct
qingy2019_OpenMath2-Llama3.1-8B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/OpenMath2-Llama3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/OpenMath2-Llama3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__OpenMath2-Llama3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/OpenMath2-Llama3.1-8B
38412f988f7688d884c9249b2a4e5cc76f98c1c6
12.751665
0
8
false
false
false
false
1.385613
0.233059
23.305939
0.409552
16.29437
0.267372
26.73716
0.265101
2.013423
0.343552
2.010677
0.155336
6.148419
false
false
2024-11-23
0
Removed
qingy2019_Oracle-14B_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Oracle-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Oracle-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Oracle-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Oracle-14B
0154031aa9306aa98da156a0f3c8e10d9f1377f6
13.34025
0
13.668
false
false
false
false
1.393024
0.235832
23.583204
0.461158
23.18463
0.064199
6.41994
0.25755
1.006711
0.371667
10.491667
0.238198
15.355349
false
false
2024-11-23
0
Removed
qingy2019_Oracle-14B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Oracle-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Oracle-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Oracle-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Oracle-14B
0154031aa9306aa98da156a0f3c8e10d9f1377f6
13.593017
0
13.668
false
false
false
false
2.737775
0.240079
24.007855
0.46223
23.301946
0.072508
7.250755
0.260906
1.454139
0.370333
10.225
0.237866
15.31841
false
false
2024-11-24
0
Removed
qingy2019_Qwen2.5-Math-14B-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Math-14B-Instruct
025d9637208b862c7b10b7590969fe6870ce01a0
38.153924
apache-2.0
1
14
true
false
false
false
3.865655
0.606626
60.662597
0.635007
47.017086
0.371601
37.160121
0.372483
16.331096
0.475729
19.632812
0.533078
48.119829
false
false
2024-12-01
2024-12-01
3
Qwen/Qwen2.5-14B
qingy2019_Qwen2.5-Math-14B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Math-14B-Instruct
025d9637208b862c7b10b7590969fe6870ce01a0
36.380504
apache-2.0
1
14
true
false
false
false
1.971893
0.600531
60.053104
0.635649
47.065572
0.276435
27.643505
0.369128
15.883669
0.475667
19.425
0.53391
48.212175
false
false
2024-12-01
2024-12-01
3
Qwen/Qwen2.5-14B
qingy2019_Qwen2.5-Math-14B-Instruct-Alpha_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Alpha" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Alpha</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Alpha-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Math-14B-Instruct-Alpha
e24aaa0779b576301bfb62b93789dea24ab10c88
36.840705
apache-2.0
2
14
true
false
false
false
3.786283
0.598083
59.808309
0.637508
47.750108
0.314199
31.41994
0.369966
15.995526
0.464938
17.950521
0.533078
48.119829
false
false
2024-12-03
2024-12-03
2
Qwen/Qwen2.5-14B
qingy2019_Qwen2.5-Math-14B-Instruct-Pro_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Math-14B-Instruct-Pro" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Math-14B-Instruct-Pro</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Math-14B-Instruct-Pro-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Math-14B-Instruct-Pro
295a9ce370c2bfeabe13f76d52c92f57ff6d0308
20.249049
0
14.766
false
false
false
true
3.319139
0.192168
19.216789
0.531869
33.036904
0.283988
28.398792
0.311242
8.165548
0.374031
4.253906
0.355801
28.422355
false
false
2024-12-03
2024-12-03
1
qingy2019/Qwen2.5-Math-14B-Instruct-Pro (Merge)
qingy2019_Qwen2.5-Ultimate-14B-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2019/Qwen2.5-Ultimate-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2019/Qwen2.5-Ultimate-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2019__Qwen2.5-Ultimate-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2019/Qwen2.5-Ultimate-14B-Instruct
3eeba743112bed957ae6dc6a3f880355c8bedb66
29.440181
1
14.766
false
false
false
true
3.904179
0.393802
39.380178
0.584156
40.580601
0.289275
28.927492
0.356544
14.205817
0.4135
9.8875
0.492936
43.659501
false
false
2024-12-02
2024-12-02
1
qingy2019/Qwen2.5-Ultimate-14B-Instruct (Merge)
qingy2024_Benchmaxx-Llama-3.2-1B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Benchmaxx-Llama-3.2-1B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Benchmaxx-Llama-3.2-1B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Benchmaxx-Llama-3.2-1B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Benchmaxx-Llama-3.2-1B-Instruct
66ec83182f1dfbad634582eb14606e6b64355f91
25.696667
2
1.236
true
false
false
true
0.313886
0.20136
20.136017
0.826914
76.699969
0.480363
48.036254
0.283557
4.474273
0.344635
3.579427
0.111287
1.254063
false
false
2025-02-22
2025-02-22
0
qingy2024/Benchmaxx-Llama-3.2-1B-Instruct
qingy2024_Eyas-17B-Instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Eyas-17B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Eyas-17B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Eyas-17B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Eyas-17B-Instruct
afa6aa65deaef3eeb733e80f0fbffcf6d70a863f
32.566805
0
17.431
false
false
false
true
4.571522
0.657459
65.745888
0.608455
43.850066
0.246979
24.697885
0.314597
8.612975
0.452167
15.354167
0.434259
37.139849
false
false
2024-12-23
2024-12-23
1
qingy2024/Eyas-17B-Instruct (Merge)
qingy2024_Falcon3-2x10B-MoE-Instruct_bfloat16
bfloat16
🟩 continuously pretrained
🟩
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Falcon3-2x10B-MoE-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Falcon3-2x10B-MoE-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Falcon3-2x10B-MoE-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Falcon3-2x10B-MoE-Instruct
e226b1f0beb60ff1e3770a694af51572b6d95dc5
35.533684
apache-2.0
0
18.799
true
true
false
true
4.386574
0.784978
78.49783
0.618493
45.073853
0.279456
27.945619
0.330537
10.738255
0.428354
12.910937
0.44232
38.035609
true
false
2024-12-25
2024-12-25
1
qingy2024/Falcon3-2x10B-MoE-Instruct (Merge)
qingy2024_Fusion-14B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Fusion-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Fusion-14B-Instruct
2e15219659b919e04ad5b56bef259489cc264f09
38.097423
1
14
false
false
false
true
3.247362
0.725977
72.597707
0.639593
48.579836
0.336858
33.685801
0.354866
13.982103
0.440042
14.805208
0.504405
44.93388
false
false
2024-12-05
2024-12-05
1
qingy2024/Fusion-14B-Instruct (Merge)
qingy2024_Fusion2-14B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Fusion2-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion2-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion2-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Fusion2-14B-Instruct
df00288ce3d37ef518189c19e7973e71b47ef214
35.257797
1
14.766
false
false
false
true
3.337332
0.606401
60.640102
0.611852
44.767044
0.312689
31.268882
0.344799
12.639821
0.463385
17.223177
0.50507
45.007757
false
false
2024-12-05
2024-12-06
1
qingy2024/Fusion2-14B-Instruct (Merge)
qingy2024_Fusion4-14B-Instruct_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
<a target="_blank" href="https://huggingface.co/qingy2024/Fusion4-14B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">qingy2024/Fusion4-14B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/qingy2024__Fusion4-14B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
qingy2024/Fusion4-14B-Instruct
3f3c7178006857d7fdf942ab7e86bd2b0d7b624d
39.552181
0
14.77
false
false
false
true
3.645661
0.764895
76.489492
0.654252
50.695856
0.388218
38.821752
0.330537
10.738255
0.432573
13.971615
0.519365
46.596114
false
false
2024-12-25
2024-12-25
1
qingy2024/Fusion4-14B-Instruct (Merge)