eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
CultriX_Qwen2.5-14B-BrocaV9_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-BrocaV9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-BrocaV9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-BrocaV9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-BrocaV9 | 883dafbff4edb8c83ef58a33413d4e09e922a53d | 39.258747 | 2 | 14.766 | false | false | false | false | 3.548006 | 0.676293 | 67.629335 | 0.639138 | 48.053225 | 0.38142 | 38.141994 | 0.364094 | 15.212528 | 0.469031 | 18.395573 | 0.533078 | 48.119829 | false | false | 2025-01-02 | 2025-01-10 | 1 | CultriX/Qwen2.5-14B-BrocaV9 (Merge) |
CultriX_Qwen2.5-14B-Brocav3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Brocav3 | 6f3fe686a79dcbcd5835ca100e194c49f493167b | 39.846832 | 2 | 14.766 | false | false | false | false | 3.633478 | 0.695178 | 69.517768 | 0.645235 | 49.049112 | 0.387462 | 38.746224 | 0.35906 | 14.541387 | 0.475635 | 19.254427 | 0.531749 | 47.972074 | false | false | 2024-12-23 | 2024-12-23 | 1 | CultriX/Qwen2.5-14B-Brocav3 (Merge) |
CultriX_Qwen2.5-14B-Brocav6_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav6" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav6</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav6-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Brocav6 | bd981505b6950df69216b260c3c0d86124fded7b | 39.84073 | 2 | 14.766 | false | false | false | false | 3.582802 | 0.699524 | 69.952393 | 0.638884 | 47.819225 | 0.387462 | 38.746224 | 0.36745 | 15.659955 | 0.474208 | 18.876042 | 0.531915 | 47.990544 | false | false | 2024-12-23 | 2024-12-23 | 1 | CultriX/Qwen2.5-14B-Brocav6 (Merge) |
CultriX_Qwen2.5-14B-Brocav7_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Brocav7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Brocav7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Brocav7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Brocav7 | 06acee7f6e9796081ced6201001784907c77f96f | 39.61738 | 0 | 14.766 | false | false | false | false | 3.402699 | 0.672372 | 67.237153 | 0.644403 | 48.905361 | 0.384441 | 38.444109 | 0.36745 | 15.659955 | 0.479604 | 20.150521 | 0.525765 | 47.307181 | false | false | 2024-12-23 | 0 | Removed |
CultriX_Qwen2.5-14B-Emerged_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emerged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emerged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emerged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Emerged | 8bf0e31b23ee22858bbde2cee44dde88963f5084 | 37.952143 | 0 | 14.766 | false | false | false | false | 3.61472 | 0.700024 | 70.002371 | 0.626003 | 45.932419 | 0.324773 | 32.477341 | 0.357383 | 14.317673 | 0.469094 | 18.470052 | 0.518617 | 46.513002 | false | false | 2024-12-19 | 0 | Removed |
CultriX_Qwen2.5-14B-Emergedv3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Emergedv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Emergedv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Emergedv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Emergedv3 | f4df1b9c2bf37bbfd6b2e8f2ff244c6029a5d546 | 38.656292 | 0 | 14.766 | false | false | false | false | 3.837857 | 0.638849 | 63.884936 | 0.619073 | 44.731608 | 0.435801 | 43.58006 | 0.360738 | 14.765101 | 0.472813 | 18.601563 | 0.51737 | 46.374483 | false | false | 2024-12-21 | 0 | Removed |
CultriX_Qwen2.5-14B-FinalMerge_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-FinalMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-FinalMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-FinalMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-FinalMerge | 8fd624d0d8989a312d344772814da3575423897a | 32.23627 | 0 | 14.766 | false | false | false | false | 3.887883 | 0.489098 | 48.909782 | 0.571495 | 38.162479 | 0.38142 | 38.141994 | 0.354866 | 13.982103 | 0.437906 | 14.504948 | 0.457447 | 39.716312 | false | false | 2024-12-23 | 0 | Removed |
CultriX_Qwen2.5-14B-Hyper_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyper" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyper</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyper-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Hyper | a6399c43f84736ed1b11d8cc7a25edf634781207 | 37.761935 | 0 | 14.766 | false | false | false | false | 7.678342 | 0.539132 | 53.913173 | 0.650745 | 49.759879 | 0.343656 | 34.365559 | 0.391779 | 18.903803 | 0.489833 | 21.029167 | 0.5374 | 48.60003 | false | false | 2025-01-19 | 0 | Removed |
CultriX_Qwen2.5-14B-HyperMarck-dl_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-HyperMarck-dl" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-HyperMarck-dl</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-HyperMarck-dl-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-HyperMarck-dl | 77ca2edd6650455182d0c7e6a7be4249cfc34f8c | 39.894168 | apache-2.0 | 0 | 14.766 | true | false | false | false | 1.968583 | 0.665028 | 66.502768 | 0.609648 | 43.785859 | 0.528701 | 52.870091 | 0.36745 | 15.659955 | 0.441563 | 15.095312 | 0.509059 | 45.45102 | true | false | 2025-02-16 | 2025-02-16 | 1 | CultriX/Qwen2.5-14B-HyperMarck-dl (Merge) |
CultriX_Qwen2.5-14B-Hyperionv3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Hyperionv3 | bc36be5b5ca3053ae96d85e962249efd0b283c82 | 39.762121 | 4 | 14.766 | false | false | false | false | 3.965711 | 0.683637 | 68.363719 | 0.652217 | 49.950055 | 0.370091 | 37.009063 | 0.370805 | 16.107383 | 0.472969 | 18.921094 | 0.533993 | 48.22141 | false | false | 2025-01-10 | 2025-01-19 | 1 | CultriX/Qwen2.5-14B-Hyperionv3 (Merge) |
CultriX_Qwen2.5-14B-Hyperionv4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Hyperionv4 | 60cc366b0648bcb40ed22ebc53d64cc5aca25550 | 37.670019 | 3 | 14.766 | false | false | false | false | 4.073614 | 0.54158 | 54.157968 | 0.647179 | 49.07652 | 0.347432 | 34.743202 | 0.397651 | 19.686801 | 0.483198 | 19.866406 | 0.536403 | 48.489214 | false | false | 2025-01-19 | 2025-01-19 | 1 | CultriX/Qwen2.5-14B-Hyperionv4 (Merge) |
CultriX_Qwen2.5-14B-Hyperionv5_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Hyperionv5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Hyperionv5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Hyperionv5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Hyperionv5 | e0f4941349664a75ddd03e4d2c190284c951e54b | 39.72497 | 2 | 14.766 | false | false | false | false | 3.973468 | 0.672921 | 67.292118 | 0.644266 | 48.94828 | 0.382175 | 38.217523 | 0.371644 | 16.219239 | 0.479542 | 19.876042 | 0.53017 | 47.796616 | false | false | 2025-01-19 | 2025-01-19 | 1 | CultriX/Qwen2.5-14B-Hyperionv5 (Merge) |
CultriX_Qwen2.5-14B-MegaMerge-pt2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MegaMerge-pt2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MegaMerge-pt2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MegaMerge-pt2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-MegaMerge-pt2 | 20397f6cafc09c2cb74f105867cd99b3c68c71dc | 38.79653 | 0 | 14.766 | false | false | false | false | 4.500868 | 0.568308 | 56.830765 | 0.65777 | 50.907903 | 0.399547 | 39.954683 | 0.379195 | 17.225951 | 0.472875 | 18.742708 | 0.542055 | 49.117169 | false | false | 2024-10-25 | 0 | Removed |
CultriX_Qwen2.5-14B-MergeStock_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-MergeStock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-MergeStock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MergeStock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-MergeStock | fa00543296f2731793dfb0aac667571ccf1abb5b | 38.744236 | 0 | 14.766 | false | false | false | false | 6.645908 | 0.568533 | 56.85326 | 0.657934 | 51.009391 | 0.414653 | 41.465257 | 0.373322 | 16.442953 | 0.467635 | 17.854427 | 0.539561 | 48.84013 | false | false | 2024-10-24 | 0 | Removed |
CultriX_Qwen2.5-14B-ReasoningMerge_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-ReasoningMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-ReasoningMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-ReasoningMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-ReasoningMerge | a310eb51c1cdcd4217e2aa303f7aac938dcc9ae1 | 40.645886 | apache-2.0 | 3 | 14.766 | true | false | false | false | 3.625355 | 0.460547 | 46.05469 | 0.657823 | 50.867898 | 0.520393 | 52.039275 | 0.407718 | 21.029083 | 0.516594 | 25.607552 | 0.534491 | 48.276817 | true | false | 2025-02-18 | 2025-02-18 | 1 | CultriX/Qwen2.5-14B-ReasoningMerge (Merge) |
CultriX_Qwen2.5-14B-Ultimav2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Ultimav2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Ultimav2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Ultimav2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Ultimav2 | 9c805171d56f5d8720c687084c1ffc26bdf0acba | 38.8356 | apache-2.0 | 4 | 14.766 | true | false | false | false | 5.907627 | 0.550023 | 55.002283 | 0.655503 | 50.441053 | 0.384441 | 38.444109 | 0.385067 | 18.008949 | 0.496563 | 22.036979 | 0.541722 | 49.08023 | true | false | 2025-02-04 | 2025-02-05 | 1 | CultriX/Qwen2.5-14B-Ultimav2 (Merge) |
CultriX_Qwen2.5-14B-Unity_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Unity" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Unity</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Unity-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Unity | 1d15e7941e6ceff5d6e4f293378947bee721a24d | 38.299229 | 3 | 14.766 | false | false | false | false | 3.827378 | 0.673895 | 67.389526 | 0.601996 | 42.258617 | 0.431269 | 43.126888 | 0.347315 | 12.975391 | 0.467948 | 18.760156 | 0.507563 | 45.284796 | false | false | 2024-12-21 | 2024-12-21 | 1 | CultriX/Qwen2.5-14B-Unity (Merge) |
CultriX_Qwen2.5-14B-Wernicke_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Wernicke | 622c0a58ecb0c0c679d7381a823d2ae5ac2b8ce1 | 37.943351 | apache-2.0 | 7 | 14.77 | true | false | false | false | 4.444469 | 0.52347 | 52.346995 | 0.656836 | 50.642876 | 0.38142 | 38.141994 | 0.393456 | 19.127517 | 0.468906 | 18.246615 | 0.542387 | 49.154108 | true | false | 2024-10-21 | 2024-10-22 | 1 | CultriX/Qwen2.5-14B-Wernicke (Merge) |
CultriX_Qwen2.5-14B-Wernicke-SFT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Wernicke-SFT | 3b68dfba2cf79e4a15e8f4271f7d4b62d2ab9f26 | 33.549512 | apache-2.0 | 2 | 14.77 | true | false | false | true | 2.786025 | 0.493744 | 49.374438 | 0.646059 | 49.330572 | 0.359517 | 35.951662 | 0.354027 | 13.870246 | 0.39 | 7.55 | 0.506981 | 45.220154 | true | false | 2024-11-16 | 2024-11-17 | 1 | CultriX/Qwen2.5-14B-Wernicke-SFT (Merge) |
CultriX_Qwen2.5-14B-Wernicke-SLERP_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SLERP" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernicke-SLERP</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SLERP-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Wernicke-SLERP | 180175561e8061be067fc349ad4491270f19976f | 36.543652 | 0 | 14.491 | false | false | false | true | 4.311975 | 0.55889 | 55.889041 | 0.644093 | 49.372327 | 0.44864 | 44.864048 | 0.34396 | 12.527964 | 0.414031 | 11.120573 | 0.509392 | 45.487958 | false | false | 2024-10-25 | 0 | Removed |
CultriX_Qwen2.5-14B-Wernickev3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-Wernickev3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-Wernickev3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernickev3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-Wernickev3 | bd141b0df78ad1f6e2938edf167c2305b395a2b2 | 38.381142 | 3 | 14.766 | false | false | false | false | 3.831269 | 0.70482 | 70.481988 | 0.618415 | 44.576275 | 0.35423 | 35.422961 | 0.362416 | 14.988814 | 0.471667 | 18.691667 | 0.515126 | 46.125148 | false | false | 2024-12-19 | 2024-12-19 | 1 | CultriX/Qwen2.5-14B-Wernickev3 (Merge) |
CultriX_Qwen2.5-14B-partialmergept1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwen2.5-14B-partialmergept1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwen2.5-14B-partialmergept1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-partialmergept1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwen2.5-14B-partialmergept1 | 02c6491a2affea23c1e5d89d324a90d24a0e5381 | 39.108717 | 0 | 14.766 | false | false | false | false | 4.018672 | 0.633729 | 63.372851 | 0.615118 | 44.594404 | 0.453927 | 45.392749 | 0.361577 | 14.876957 | 0.475698 | 19.66224 | 0.520778 | 46.753103 | false | false | 2025-01-19 | 0 | Removed |
CultriX_Qwenfinity-2.5-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwenfinity-2.5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwenfinity-2.5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwenfinity-2.5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwenfinity-2.5-14B | 6acc1308274031b045f028b0a0290cdbe4243a04 | 32.322008 | 0 | 14.766 | false | false | false | false | 3.954133 | 0.481379 | 48.137941 | 0.565501 | 37.259942 | 0.410121 | 41.012085 | 0.348993 | 13.199105 | 0.450583 | 15.45625 | 0.449801 | 38.866726 | false | false | 2024-12-23 | 0 | Removed |
CultriX_Qwestion-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/Qwestion-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/Qwestion-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwestion-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/Qwestion-14B | e286bfafbc28e36859202c9f06ed8287a4f1d8b6 | 38.549226 | 0 | 14.766 | false | false | false | false | 3.707642 | 0.63178 | 63.178034 | 0.64501 | 48.757034 | 0.372356 | 37.23565 | 0.368289 | 15.771812 | 0.463604 | 17.217188 | 0.542221 | 49.135638 | false | false | 2024-11-23 | 0 | Removed |
CultriX_SeQwence-14B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/SeQwence-14B | f4a147b717ba0e9392f96e343250b00239196a22 | 36.886273 | apache-2.0 | 3 | 14.766 | true | false | false | false | 3.592765 | 0.53516 | 53.516004 | 0.650567 | 50.163578 | 0.353474 | 35.347432 | 0.360738 | 14.765101 | 0.466615 | 18.426823 | 0.541888 | 49.0987 | false | false | 2024-11-20 | 2024-11-20 | 0 | CultriX/SeQwence-14B |
CultriX_SeQwence-14B-EvolMerge_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMerge-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/SeQwence-14B-EvolMerge | a98c932f0d71d76883fe9aa9d708af0506b01343 | 38.018641 | apache-2.0 | 2 | 14.766 | true | false | false | false | 3.901652 | 0.538158 | 53.815764 | 0.657218 | 50.780351 | 0.367069 | 36.706949 | 0.380872 | 17.449664 | 0.482083 | 20.260417 | 0.541888 | 49.0987 | true | false | 2024-11-27 | 2024-11-27 | 1 | CultriX/SeQwence-14B-EvolMerge (Merge) |
CultriX_SeQwence-14B-EvolMergev1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-EvolMergev1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-EvolMergev1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-EvolMergev1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/SeQwence-14B-EvolMergev1 | 6cc7116cdea757635dba52bb82a306654d118e77 | 38.463462 | 2 | 14.766 | false | false | false | false | 3.915792 | 0.555468 | 55.546838 | 0.654555 | 50.302259 | 0.42145 | 42.145015 | 0.376678 | 16.89038 | 0.462271 | 17.083854 | 0.539312 | 48.812426 | false | false | 2024-11-25 | 2024-11-27 | 1 | CultriX/SeQwence-14B-EvolMergev1 (Merge) |
CultriX_SeQwence-14B-v5_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14B-v5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14B-v5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14B-v5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/SeQwence-14B-v5 | 9f43ad41542be56f6a18f31bfa60086318735ed5 | 37.608542 | 0 | 14.766 | false | false | false | false | 3.73032 | 0.591988 | 59.198815 | 0.651709 | 49.995731 | 0.330816 | 33.081571 | 0.369966 | 15.995526 | 0.471417 | 18.327083 | 0.541473 | 49.052527 | false | false | 2024-11-18 | 0 | Removed |
CultriX_SeQwence-14Bv1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/SeQwence-14Bv1 | 542bfbd2e6fb25ecd11b84d956764eb23233a034 | 38.625628 | apache-2.0 | 2 | 14.766 | true | false | false | false | 3.660382 | 0.6678 | 66.780033 | 0.634467 | 47.190898 | 0.361027 | 36.102719 | 0.361577 | 14.876957 | 0.470427 | 18.803385 | 0.531998 | 47.999778 | true | false | 2024-11-24 | 2024-11-27 | 1 | CultriX/SeQwence-14Bv1 (Merge) |
CultriX_SeQwence-14Bv2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/SeQwence-14Bv2 | 674c6d49b604fdf26e327e1e86c4fde0724b98e8 | 38.740075 | 0 | 14.766 | false | false | false | false | 3.949787 | 0.578599 | 57.859923 | 0.630451 | 46.529224 | 0.475831 | 47.583082 | 0.360738 | 14.765101 | 0.460104 | 17.546354 | 0.533411 | 48.156767 | false | false | 2024-12-08 | 0 | Removed |
CultriX_SeQwence-14Bv3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/CultriX/SeQwence-14Bv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">CultriX/SeQwence-14Bv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/CultriX__SeQwence-14Bv3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | CultriX/SeQwence-14Bv3 | b3f2b5273bbc996814a25aa9060fd6f4c0d93bca | 38.665816 | 2 | 14.766 | false | false | false | false | 3.930149 | 0.571905 | 57.190477 | 0.630225 | 46.385368 | 0.476586 | 47.65861 | 0.364933 | 15.324385 | 0.462427 | 17.270052 | 0.533494 | 48.166002 | false | false | 2024-11-27 | 2024-11-27 | 1 | CultriX/SeQwence-14Bv3 (Merge) |
DRXD1000_Atlas-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DRXD1000/Atlas-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DRXD1000/Atlas-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DRXD1000__Atlas-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DRXD1000/Atlas-7B | 967ee983e2a0b163c12da69f1f81aaf8ffb2a456 | 8.786577 | apache-2.0 | 0 | 7.768 | true | false | false | true | 2.513517 | 0.370446 | 37.044597 | 0.330218 | 7.540208 | 0.018882 | 1.888218 | 0.25755 | 1.006711 | 0.33425 | 0.78125 | 0.140126 | 4.458481 | false | false | 2024-12-10 | 2024-12-10 | 0 | DRXD1000/Atlas-7B |
DRXD1000_Phoenix-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/DRXD1000/Phoenix-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DRXD1000/Phoenix-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DRXD1000__Phoenix-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DRXD1000/Phoenix-7B | a5caa8036d8b7819eb723debe3f037471b5c4882 | 12.420154 | apache-2.0 | 17 | 7.242 | true | false | false | true | 0.941745 | 0.320962 | 32.096171 | 0.393157 | 15.62018 | 0.016616 | 1.661631 | 0.278523 | 3.803132 | 0.384948 | 6.41849 | 0.234292 | 14.921321 | false | false | 2024-01-10 | 2024-12-11 | 0 | DRXD1000/Phoenix-7B |
DUAL-GPO_zephyr-7b-ipo-0k-15k-i1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | <a target="_blank" href="https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-0k-15k-i1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DUAL-GPO/zephyr-7b-ipo-0k-15k-i1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DUAL-GPO__zephyr-7b-ipo-0k-15k-i1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DUAL-GPO/zephyr-7b-ipo-0k-15k-i1 | 564d269c67dfcc5c07a4fbc270a6a48da1929d30 | 15.492948 | 0 | 14.483 | false | false | false | false | 1.942847 | 0.275624 | 27.562423 | 0.447271 | 22.658643 | 0.030211 | 3.021148 | 0.291107 | 5.480984 | 0.417344 | 10.567969 | 0.312999 | 23.666519 | false | false | 2024-09-20 | 2024-09-22 | 1 | DUAL-GPO/zephyr-7b-ipo-qlora-v0-merged |
DZgas_GIGABATEMAN-7B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/DZgas/GIGABATEMAN-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DZgas/GIGABATEMAN-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DZgas__GIGABATEMAN-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DZgas/GIGABATEMAN-7B | edf2840350e7fd55895d9df560b489ac10ecb95e | 20.471469 | 7 | 7.242 | false | false | false | false | 1.260675 | 0.460746 | 46.074638 | 0.503218 | 29.827517 | 0.055136 | 5.513595 | 0.28943 | 5.257271 | 0.432844 | 11.972135 | 0.317653 | 24.183658 | false | false | 2024-04-17 | 2024-09-15 | 1 | DZgas/GIGABATEMAN-7B (Merge) |
Daemontatox_AetherDrake-SFT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/AetherDrake-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherDrake-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherDrake-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/AetherDrake-SFT | 17a0f90f0c06f2adc885faccd0a6172a7b996126 | 22.917961 | apache-2.0 | 1 | 8.03 | true | false | false | false | 2.196694 | 0.48128 | 48.127967 | 0.487201 | 27.139252 | 0.151057 | 15.10574 | 0.32047 | 9.395973 | 0.408844 | 9.972135 | 0.3499 | 27.766696 | false | false | 2024-12-24 | 2024-12-25 | 1 | Daemontatox/AetherDrake-SFT (Merge) |
Daemontatox_AetherSett_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/AetherSett" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherSett</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherSett-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/AetherSett | d8d86c6dc1b693192931b02e39290eca331ae84e | 31.420123 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.964645 | 0.536959 | 53.69586 | 0.545162 | 34.744146 | 0.397281 | 39.728097 | 0.307886 | 7.718121 | 0.460312 | 16.205729 | 0.427859 | 36.428783 | false | false | 2024-12-30 | 2024-12-30 | 3 | Qwen/Qwen2.5-7B |
Daemontatox_AetherTOT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MllamaForConditionalGeneration | <a target="_blank" href="https://huggingface.co/Daemontatox/AetherTOT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherTOT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherTOT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/AetherTOT | 71d99f8fb69276422daae61222e57087000c05b0 | 23.178825 | apache-2.0 | 0 | 10.67 | true | false | false | false | 1.397847 | 0.439764 | 43.976427 | 0.506606 | 29.436391 | 0.148792 | 14.879154 | 0.323826 | 9.8434 | 0.407854 | 9.781771 | 0.380402 | 31.155807 | false | false | 2024-12-27 | 2024-12-28 | 2 | meta-llama/Llama-3.2-11B-Vision-Instruct |
Daemontatox_AetherTOT_bfloat16 | bfloat16 | 🌸 multimodal | 🌸 | Original | MllamaForConditionalGeneration | <a target="_blank" href="https://huggingface.co/Daemontatox/AetherTOT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherTOT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherTOT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/AetherTOT | 71d99f8fb69276422daae61222e57087000c05b0 | 22.874708 | apache-2.0 | 0 | 10.67 | true | false | false | false | 0.708698 | 0.43829 | 43.82904 | 0.503431 | 29.031857 | 0.14426 | 14.425982 | 0.323826 | 9.8434 | 0.405188 | 9.248438 | 0.377826 | 30.869533 | false | false | 2024-12-27 | 2024-12-28 | 2 | meta-llama/Llama-3.2-11B-Vision-Instruct |
Daemontatox_AetherUncensored_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/AetherUncensored" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/AetherUncensored</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__AetherUncensored-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/AetherUncensored | e498d645faab591062c6919a98b35656e2d0c783 | 18.374864 | 0 | 8.03 | false | false | false | false | 1.478506 | 0.404193 | 40.41931 | 0.446313 | 21.678618 | 0.145015 | 14.501511 | 0.288591 | 5.145414 | 0.374677 | 9.501302 | 0.271027 | 19.003029 | false | false | 2025-01-09 | 0 | Removed |
Daemontatox_Cogito-MIS_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Cogito-MIS" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Cogito-MIS</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Cogito-MIS-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Cogito-MIS | c1d59d3bc93d7ae4816800e37333f375e1debabf | 11.081962 | 0 | 23.572 | false | false | false | true | 1.765364 | 0.181452 | 18.145188 | 0.505998 | 29.07597 | 0.086103 | 8.610272 | 0.256711 | 0.894855 | 0.37676 | 4.928385 | 0.143534 | 4.837101 | false | false | 2025-02-18 | 0 | Removed |
Daemontatox_CogitoDistil_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/CogitoDistil" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/CogitoDistil</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__CogitoDistil-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/CogitoDistil | f9a5302a0c4b464c44d79f745b8498ab51dd97de | 17.180474 | 0 | 7.616 | false | false | false | true | 1.629079 | 0.277648 | 27.764775 | 0.367677 | 11.948759 | 0.392749 | 39.274924 | 0.259228 | 1.230425 | 0.37549 | 4.802865 | 0.26255 | 18.061096 | false | false | 2025-01-22 | 0 | Removed |
Daemontatox_CogitoZ_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/CogitoZ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/CogitoZ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__CogitoZ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/CogitoZ | 7079c4e915e6f549df9f1c3fa3a3260f9a835f48 | 39.383291 | apache-2.0 | 0 | 32.764 | true | false | false | true | 8.863382 | 0.396724 | 39.672403 | 0.673449 | 53.889571 | 0.524169 | 52.416918 | 0.395134 | 19.35123 | 0.47926 | 19.940885 | 0.559259 | 51.028738 | false | false | 2025-01-03 | 2025-02-13 | 1 | Daemontatox/CogitoZ (Merge) |
Daemontatox_CogitoZ14_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/CogitoZ14" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/CogitoZ14</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__CogitoZ14-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/CogitoZ14 | df5320d7ff115f1e39e42506ed86a340eb2d12e0 | 34.38343 | 0 | 14.77 | false | false | false | true | 5.19345 | 0.663703 | 66.370342 | 0.629751 | 46.479352 | 0.422205 | 42.220544 | 0.316275 | 8.836689 | 0.405875 | 9.067708 | 0.399934 | 33.325946 | false | false | 2025-01-07 | 0 | Removed |
Daemontatox_DocumentCogito_bfloat16 | bfloat16 | 🌸 multimodal | 🌸 | Original | MllamaForConditionalGeneration | <a target="_blank" href="https://huggingface.co/Daemontatox/DocumentCogito" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/DocumentCogito</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__DocumentCogito-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/DocumentCogito | 23dcfc6bf91d84db1c977b151fd0923270d3e3ef | 24.220439 | apache-2.0 | 1 | 10.67 | true | false | false | false | 1.413317 | 0.506434 | 50.643404 | 0.511156 | 29.793609 | 0.163142 | 16.314199 | 0.316275 | 8.836689 | 0.397313 | 8.597396 | 0.380236 | 31.137337 | false | false | 2025-01-16 | 2025-01-16 | 2 | meta-llama/Llama-3.2-11B-Vision-Instruct |
Daemontatox_DocumentCogito_float16 | float16 | 🌸 multimodal | 🌸 | Original | MllamaForConditionalGeneration | <a target="_blank" href="https://huggingface.co/Daemontatox/DocumentCogito" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/DocumentCogito</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__DocumentCogito-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/DocumentCogito | 9bdbfd8f330754c4103822ce180e0e3e3ce0973e | 29.108156 | apache-2.0 | 1 | 10.67 | true | false | false | true | 0.711757 | 0.777035 | 77.703493 | 0.518673 | 31.184823 | 0.219789 | 21.978852 | 0.293624 | 5.816555 | 0.391052 | 7.548177 | 0.373753 | 30.417036 | false | false | 2025-01-16 | 2025-03-09 | 2 | meta-llama/Llama-3.2-11B-Vision-Instruct |
Daemontatox_Llama3.3-70B-CogniLink_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Llama3.3-70B-CogniLink" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Llama3.3-70B-CogniLink</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Llama3.3-70B-CogniLink-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Llama3.3-70B-CogniLink | 69f134f69472a84d104d3ef0c0b1dd200b9a599d | 42.774714 | apache-2.0 | 2 | 70.554 | true | false | false | true | 32.378236 | 0.693104 | 69.31043 | 0.666833 | 52.124663 | 0.413897 | 41.389728 | 0.44547 | 26.06264 | 0.487698 | 21.395573 | 0.517287 | 46.365248 | false | false | 2025-01-10 | 2025-03-02 | 1 | Daemontatox/Llama3.3-70B-CogniLink (Merge) |
Daemontatox_Llama_cot_float16 | float16 | 🌸 multimodal | 🌸 | Original | MllamaForConditionalGeneration | <a target="_blank" href="https://huggingface.co/Daemontatox/Llama_cot" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Llama_cot</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Llama_cot-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Llama_cot | e0b1e5ec44b5dac34aa3bf99e0faf7c6c3f1390f | 27.115742 | 0 | 10.67 | false | false | false | true | 0.750703 | 0.754878 | 75.487817 | 0.483837 | 26.866583 | 0.202417 | 20.241692 | 0.291107 | 5.480984 | 0.38724 | 6.638281 | 0.351812 | 27.979093 | false | false | 2025-03-09 | 0 | Removed |
Daemontatox_MawaredT1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/MawaredT1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/MawaredT1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__MawaredT1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/MawaredT1 | 84a1d35d91b862a5cfc65988d4a0f65033b34c47 | 29.231298 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.276958 | 0.41988 | 41.988036 | 0.521482 | 31.900788 | 0.302115 | 30.21148 | 0.334732 | 11.297539 | 0.470208 | 18.676042 | 0.471825 | 41.313904 | false | false | 2025-01-02 | 2025-01-02 | 2 | arcee-ai/Meraj-Mini (Merge) |
Daemontatox_Mini_QwQ_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Mini_QwQ" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Mini_QwQ</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Mini_QwQ-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Mini_QwQ | e96df7ba6e989ee286da5d0b05a84525fdb56c53 | 30.832499 | 0 | 7.616 | false | false | false | false | 1.317404 | 0.449706 | 44.970567 | 0.554899 | 36.210285 | 0.419184 | 41.918429 | 0.303691 | 7.158837 | 0.46825 | 17.264583 | 0.437251 | 37.472296 | false | false | 2025-01-16 | 0 | Removed |
Daemontatox_NemoR_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/NemoR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/NemoR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__NemoR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/NemoR | 688f1a4c3c69fe9c6440cad7919ab602ae61fa39 | 18.073998 | 0 | 6.124 | false | false | false | false | 2.261325 | 0.228738 | 22.873753 | 0.519407 | 31.60552 | 0.083082 | 8.308157 | 0.327181 | 10.290828 | 0.390802 | 9.916927 | 0.329039 | 25.448803 | false | false | 2024-12-31 | 0 | Removed |
Daemontatox_PathFinderAI2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/PathFinderAI2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/PathFinderAI2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathFinderAI2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/PathFinderAI2.0 | bf8cfd82d4ceceb133058a78e1fe48436b50568a | 36.256652 | apache-2.0 | 0 | 32.764 | true | false | false | true | 14.003082 | 0.454102 | 45.410178 | 0.665823 | 52.956513 | 0.507553 | 50.755287 | 0.302013 | 6.935123 | 0.421563 | 10.961979 | 0.554688 | 50.520833 | false | false | 2024-12-30 | 2025-01-21 | 4 | Qwen/Qwen2.5-32B |
Daemontatox_PathFinderAi3.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/PathFinderAi3.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/PathFinderAi3.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathFinderAi3.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/PathFinderAi3.0 | 6c9aa17cee032523ce17de111d6865e33825cf1d | 40.458694 | apache-2.0 | 1 | 32.764 | true | false | false | true | 8.094724 | 0.427099 | 42.709899 | 0.688422 | 55.538355 | 0.504532 | 50.453172 | 0.408557 | 21.14094 | 0.480688 | 20.052604 | 0.575715 | 52.857196 | false | false | 2024-12-31 | 2025-01-21 | 1 | Daemontatox/PathFinderAI3.0 |
Daemontatox_PathfinderAI_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/PathfinderAI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/PathfinderAI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathfinderAI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/PathfinderAI | 14c6a91351006b7be0aff85292733470ff1b546d | 38.131314 | apache-2.0 | 0 | 32.764 | true | false | false | false | 4.540918 | 0.374517 | 37.451739 | 0.666785 | 52.646547 | 0.475831 | 47.583082 | 0.394295 | 19.239374 | 0.485833 | 20.829167 | 0.559342 | 51.037973 | false | false | 2024-12-24 | 2024-12-25 | 1 | Daemontatox/PathfinderAI (Merge) |
Daemontatox_PathfinderAI_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/PathfinderAI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/PathfinderAI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PathfinderAI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/PathfinderAI | 7271fc7d08fca9b12c49b40af6245a982273a5c3 | 36.548768 | apache-2.0 | 0 | 32.764 | true | false | false | true | 9.451441 | 0.485501 | 48.550069 | 0.662734 | 52.322163 | 0.484139 | 48.413897 | 0.309564 | 7.941834 | 0.425594 | 11.599219 | 0.554189 | 50.465426 | false | false | 2024-12-24 | 2024-12-30 | 1 | Daemontatox/PathfinderAI (Merge) |
Daemontatox_Phi-4-COT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Phi-4-COT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Phi-4-COT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Phi-4-COT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Phi-4-COT | bfc745d1a347b74843671eb50687c2e88c07ec7d | 26.128818 | 0 | 14.66 | false | false | false | false | 1.715153 | 0.179303 | 17.930314 | 0.617293 | 45.34299 | 0.22432 | 22.432024 | 0.33557 | 11.409396 | 0.453 | 15.158333 | 0.500499 | 44.499852 | false | false | 2025-01-11 | 0 | Removed |
Daemontatox_PixelParse_AI_bfloat16 | bfloat16 | 🌸 multimodal | 🌸 | Original | MllamaForConditionalGeneration | <a target="_blank" href="https://huggingface.co/Daemontatox/PixelParse_AI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/PixelParse_AI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__PixelParse_AI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/PixelParse_AI | cc94604b91fc38513ca61f11dd9e1de1c3cc3b3d | 22.925061 | apache-2.0 | 0 | 10.67 | true | false | false | false | 1.400219 | 0.43829 | 43.82904 | 0.503431 | 29.031857 | 0.147281 | 14.728097 | 0.323826 | 9.8434 | 0.405188 | 9.248438 | 0.377826 | 30.869533 | false | false | 2024-12-27 | 2024-12-29 | 2 | meta-llama/Llama-3.2-11B-Vision-Instruct |
Daemontatox_RA2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/RA2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/RA2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__RA2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/RA2.0 | e1505dd5f9f2c8549cc852a1aca3ec545638e813 | 23.232563 | 0 | 7.616 | false | false | false | false | 1.327376 | 0.378389 | 37.838934 | 0.488869 | 28.471838 | 0.383686 | 38.36858 | 0.305369 | 7.38255 | 0.409125 | 9.373958 | 0.261636 | 17.959515 | false | false | 2025-01-01 | 0 | Removed |
Daemontatox_RA_Reasoner_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/RA_Reasoner" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/RA_Reasoner</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__RA_Reasoner-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/RA_Reasoner | e799c6877cb70b6e78c1e337eaa58383040c8fa9 | 29.208003 | apache-2.0 | 2 | 10.306 | true | false | false | false | 1.558147 | 0.559215 | 55.92151 | 0.605369 | 43.073008 | 0.212236 | 21.223565 | 0.331376 | 10.850112 | 0.396354 | 7.510938 | 0.43002 | 36.668883 | false | false | 2024-12-20 | 2024-12-25 | 2 | tiiuae/Falcon3-10B-Base |
Daemontatox_RA_Reasoner2.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/RA_Reasoner2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/RA_Reasoner2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__RA_Reasoner2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/RA_Reasoner2.0 | 2a7477f34b171d2ae090e57abdbd997546dee242 | 29.039667 | apache-2.0 | 0 | 10.306 | true | false | false | false | 1.573513 | 0.536634 | 53.663391 | 0.606247 | 43.070069 | 0.231118 | 23.111782 | 0.324664 | 9.955257 | 0.388354 | 7.177604 | 0.435339 | 37.2599 | false | false | 2024-12-29 | 2024-12-29 | 3 | tiiuae/Falcon3-10B-Base |
Daemontatox_ReasonTest_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/ReasonTest" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/ReasonTest</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__ReasonTest-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/ReasonTest | 8e81cfddd97a13d81d6207eb72be8b730a7ca12f | 25.858233 | 0 | 3.808 | false | false | false | false | 1.340629 | 0.407965 | 40.796531 | 0.543526 | 35.375037 | 0.213746 | 21.374622 | 0.318792 | 9.17226 | 0.431542 | 12.076042 | 0.427194 | 36.354905 | false | false | 2024-12-31 | 0 | Removed |
Daemontatox_Research_PathfinderAI_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Research_PathfinderAI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Research_PathfinderAI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Research_PathfinderAI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Research_PathfinderAI | eae32cc9dffa3a2493fd793f7b847e7bb3376853 | 9.365879 | 0 | 1.777 | false | false | false | true | 0.618841 | 0.345692 | 34.569165 | 0.287226 | 1.426346 | 0.16994 | 16.993958 | 0.240772 | 0 | 0.339396 | 1.757812 | 0.113032 | 1.447991 | false | false | 2025-02-21 | 0 | Removed |
Daemontatox_SphinX_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/SphinX" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/SphinX</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__SphinX-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/SphinX | 3da400d648b198211c81f61421bdcefac8073506 | 29.87478 | apache-2.0 | 2 | 7.616 | true | false | false | false | 1.304317 | 0.572504 | 57.250429 | 0.544058 | 34.712451 | 0.308157 | 30.81571 | 0.297819 | 6.375839 | 0.4405 | 12.695833 | 0.436586 | 37.398419 | false | false | 2024-12-21 | 2024-12-31 | 1 | Daemontatox/SphinX (Merge) |
Daemontatox_Sphinx2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Sphinx2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Sphinx2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Sphinx2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Sphinx2.0 | 16abdfe2c214dc1da6bfe654b3d6716fcc8450e2 | 37.694185 | apache-2.0 | 0 | 14.77 | true | false | false | true | 3.59265 | 0.712313 | 71.231333 | 0.647284 | 49.396752 | 0.401813 | 40.181269 | 0.293624 | 5.816555 | 0.426031 | 13.053906 | 0.518368 | 46.485298 | false | false | 2024-12-30 | 2024-12-30 | 1 | Daemontatox/Sphinx2.0 (Merge) |
Daemontatox_TinySphinx_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/TinySphinx" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/TinySphinx</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__TinySphinx-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/TinySphinx | 62172ccb670864070581498fb12e7d2594ac3a77 | 8.167167 | 0 | 0.247 | false | false | false | false | 1.007256 | 0.25669 | 25.669003 | 0.330984 | 6.546576 | 0.043051 | 4.305136 | 0.27349 | 3.131991 | 0.33276 | 1.595052 | 0.169797 | 7.755245 | false | false | 2024-12-31 | 0 | Removed |
Daemontatox_TinySphinx2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/TinySphinx2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/TinySphinx2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__TinySphinx2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/TinySphinx2.0 | accc28aa00084fe89801baa0885c291d18a031ec | 7.583927 | 0 | 0.247 | false | false | false | false | 1.004172 | 0.253517 | 25.351733 | 0.316841 | 5.004029 | 0.032477 | 3.247734 | 0.268456 | 2.46085 | 0.33825 | 1.314583 | 0.173122 | 8.124631 | false | false | 2024-12-31 | 0 | Removed |
Daemontatox_Zirel-7B-Math_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Zirel-7B-Math" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Zirel-7B-Math</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Zirel-7B-Math-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Zirel-7B-Math | 104d5e9f5df50c0782ff1a830f7ec3c4943210f3 | 30.976625 | apache-2.0 | 0 | 7.616 | true | false | false | true | 0.538753 | 0.663879 | 66.387851 | 0.54477 | 34.939441 | 0.197885 | 19.78852 | 0.326342 | 10.178971 | 0.478917 | 18.597917 | 0.423703 | 35.967051 | false | false | 2025-02-28 | 2025-02-28 | 3 | Qwen/Qwen2.5-7B |
Daemontatox_Zirel_1.5_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/Zirel_1.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/Zirel_1.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__Zirel_1.5-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/Zirel_1.5 | 53af159f98d8b428e719287f759500f95b601ee2 | 14.243506 | apache-2.0 | 0 | 1.544 | true | false | false | true | 0.579531 | 0.416758 | 41.675754 | 0.398467 | 15.082126 | 0.113293 | 11.329305 | 0.260067 | 1.342282 | 0.365813 | 3.326562 | 0.214345 | 12.705009 | false | false | 2025-03-04 | 2025-03-04 | 3 | Qwen/Qwen2.5-Coder-1.5B-Instruct (Merge) |
Daemontatox_mini-Cogito-R1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/mini-Cogito-R1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/mini-Cogito-R1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__mini-Cogito-R1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/mini-Cogito-R1 | 7d86cfe7522a080853a6c25f7115fa5106c9d671 | 11.629718 | apache-2.0 | 4 | 1.777 | true | false | false | false | 0.610988 | 0.229837 | 22.983683 | 0.328049 | 6.038995 | 0.274924 | 27.492447 | 0.286913 | 4.9217 | 0.344698 | 2.98724 | 0.148188 | 5.354241 | false | false | 2025-02-22 | 2025-02-22 | 1 | Daemontatox/mini-Cogito-R1 (Merge) |
Daemontatox_mini_Pathfinder_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Daemontatox/mini_Pathfinder" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Daemontatox/mini_Pathfinder</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Daemontatox__mini_Pathfinder-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Daemontatox/mini_Pathfinder | 20d12c01e831675a563c978900bcf291def5f7dd | 19.872595 | 0 | 7.616 | false | false | false | true | 1.548263 | 0.296158 | 29.615753 | 0.395569 | 16.030028 | 0.475076 | 47.507553 | 0.258389 | 1.118568 | 0.378094 | 4.861719 | 0.280918 | 20.10195 | false | false | 2025-01-20 | 0 | Removed |
Dampfinchen_Llama-3.1-8B-Ultra-Instruct_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Dampfinchen/Llama-3.1-8B-Ultra-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dampfinchen/Llama-3.1-8B-Ultra-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dampfinchen__Llama-3.1-8B-Ultra-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dampfinchen/Llama-3.1-8B-Ultra-Instruct | 46662d14130cfd34f7d90816540794f24a301f86 | 30.159277 | llama3 | 8 | 8.03 | true | false | false | true | 1.672957 | 0.808109 | 80.810915 | 0.525753 | 32.494587 | 0.220544 | 22.054381 | 0.291946 | 5.592841 | 0.400323 | 8.607031 | 0.382563 | 31.395907 | true | false | 2024-08-26 | 2024-08-26 | 1 | Dampfinchen/Llama-3.1-8B-Ultra-Instruct (Merge) |
Danielbrdz_Barcenas-10b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-10b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-10b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-10b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Danielbrdz/Barcenas-10b | 71884e96b88f6c86fca3a528ddf71c7745cb1d76 | 31.870971 | apache-2.0 | 1 | 10.306 | true | false | false | false | 1.61931 | 0.660781 | 66.078117 | 0.612083 | 43.769695 | 0.215257 | 21.52568 | 0.341443 | 12.192394 | 0.413469 | 10.316927 | 0.436087 | 37.343011 | false | false | 2025-01-04 | 2025-01-06 | 1 | Danielbrdz/Barcenas-10b (Merge) |
Danielbrdz_Barcenas-14b-Phi-3-medium-ORPO_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-14b-Phi-3-medium-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO | b749dbcb19901b8fd0e9f38c923a24533569f895 | 31.889505 | mit | 5 | 13.96 | true | false | false | true | 2.354154 | 0.479906 | 47.990554 | 0.653618 | 51.029418 | 0.202417 | 20.241692 | 0.326342 | 10.178971 | 0.48075 | 20.527083 | 0.472324 | 41.369311 | false | false | 2024-06-15 | 2024-08-13 | 0 | Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO |
Danielbrdz_Barcenas-14b-phi-4_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-14b-phi-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-14b-phi-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-14b-phi-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Danielbrdz/Barcenas-14b-phi-4 | 53891d973087e8909e1c9cc968b7bf222247e2ab | 28.746056 | mit | 1 | 14.66 | true | false | false | false | 1.747853 | 0.049759 | 4.975908 | 0.67693 | 53.257692 | 0.258308 | 25.830816 | 0.383389 | 17.785235 | 0.509677 | 24.242969 | 0.517453 | 46.383717 | false | false | 2025-01-19 | 2025-01-26 | 1 | Danielbrdz/Barcenas-14b-phi-4 (Merge) |
Danielbrdz_Barcenas-14b-phi-4-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-14b-phi-4-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-14b-phi-4-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-14b-phi-4-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Danielbrdz/Barcenas-14b-phi-4-v2 | b602beb38b9a82ac497e6689751927eca9dbd876 | 31.447866 | mit | 0 | 14.66 | true | false | false | false | 1.988565 | 0.277473 | 27.747266 | 0.6573 | 50.20693 | 0.321752 | 32.175227 | 0.378356 | 17.114094 | 0.439948 | 14.29349 | 0.524352 | 47.150192 | false | false | 2025-02-04 | 2025-02-05 | 1 | Danielbrdz/Barcenas-14b-phi-4-v2 (Merge) |
Danielbrdz_Barcenas-3b-GRPO_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-3b-GRPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-3b-GRPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-3b-GRPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Danielbrdz/Barcenas-3b-GRPO | 643e7615446a20d9ffe7cb66b88a6791cc6ae1eb | 20.565477 | llama3.2 | 0 | 3.213 | true | false | false | false | 0.617667 | 0.544428 | 54.442767 | 0.441435 | 21.136617 | 0.137462 | 13.746224 | 0.290268 | 5.369128 | 0.357594 | 6.065885 | 0.30369 | 22.63224 | false | false | 2025-02-08 | 2025-02-08 | 1 | Danielbrdz/Barcenas-3b-GRPO (Merge) |
Danielbrdz_Barcenas-Llama3-8b-ORPO_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-Llama3-8b-ORPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-Llama3-8b-ORPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-Llama3-8b-ORPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Danielbrdz/Barcenas-Llama3-8b-ORPO | 66c848c4526d3db1ec41468c0f73ac4448c6abe9 | 26.519005 | other | 7 | 8.03 | true | false | false | true | 1.548318 | 0.737243 | 73.724274 | 0.498656 | 28.600623 | 0.06571 | 6.570997 | 0.307047 | 7.606264 | 0.418958 | 11.169792 | 0.382979 | 31.44208 | false | false | 2024-04-29 | 2024-06-29 | 0 | Danielbrdz/Barcenas-Llama3-8b-ORPO |
Danielbrdz_Barcenas-R1-Qwen-1.5b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/Danielbrdz/Barcenas-R1-Qwen-1.5b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Danielbrdz/Barcenas-R1-Qwen-1.5b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-R1-Qwen-1.5b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Danielbrdz/Barcenas-R1-Qwen-1.5b | 10e2f6bd3bb254f7e4e6857ab2799aaa9c855876 | 15.138859 | mit | 0 | 1.777 | true | false | false | false | 1.222507 | 0.242801 | 24.280132 | 0.35872 | 10.49126 | 0.349698 | 34.969789 | 0.303691 | 7.158837 | 0.354125 | 3.832292 | 0.190908 | 10.100842 | false | false | 2025-01-26 | 2025-01-26 | 1 | Danielbrdz/Barcenas-R1-Qwen-1.5b (Merge) |
Dans-DiscountModels_12b-mn-dans-reasoning-test-2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/12b-mn-dans-reasoning-test-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/12b-mn-dans-reasoning-test-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__12b-mn-dans-reasoning-test-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/12b-mn-dans-reasoning-test-2 | d573ad0cdb0ccfc194bc9c65dd81912dffeb1d35 | 15.564778 | 0 | 12.248 | true | false | false | true | 0.944729 | 0.371095 | 37.109536 | 0.480703 | 26.108938 | 0.063444 | 6.344411 | 0.27349 | 3.131991 | 0.370219 | 3.94401 | 0.250748 | 16.749778 | false | false | 2025-03-07 | 2025-03-07 | 0 | Dans-DiscountModels/12b-mn-dans-reasoning-test-2 |
Dans-DiscountModels_12b-mn-dans-reasoning-test-3_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/12b-mn-dans-reasoning-test-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/12b-mn-dans-reasoning-test-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__12b-mn-dans-reasoning-test-3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/12b-mn-dans-reasoning-test-3 | e64145422fe367c1d3cbf8403cdc9cd2c6ccd5ca | 19.131272 | 0 | 12.248 | true | false | false | true | 0.869661 | 0.505259 | 50.525938 | 0.483888 | 25.848641 | 0.077795 | 7.779456 | 0.270973 | 2.796421 | 0.41676 | 10.995052 | 0.251579 | 16.842125 | false | false | 2025-03-09 | 2025-03-10 | 0 | Dans-DiscountModels/12b-mn-dans-reasoning-test-3 |
Dans-DiscountModels_Dans-Instruct-CoreCurriculum-12b-ChatML_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-CoreCurriculum-12b-ChatML-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML | 56925fafe6a543e224db36864dd0927171542776 | 13.542858 | apache-2.0 | 0 | 12.248 | true | false | false | false | 4.636904 | 0.211102 | 21.11021 | 0.479186 | 26.046417 | 0.043051 | 4.305136 | 0.280201 | 4.026846 | 0.360635 | 5.71276 | 0.280502 | 20.055777 | false | false | 2024-09-04 | 2024-09-04 | 1 | mistralai/Mistral-Nemo-Base-2407 |
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML | 029d84d4f4a618aa798490c046753b12801158e2 | 13.521356 | 0 | 8.03 | false | false | false | false | 1.597138 | 0.082508 | 8.250775 | 0.473817 | 26.336394 | 0.055136 | 5.513595 | 0.294463 | 5.928412 | 0.391823 | 9.677865 | 0.32879 | 25.421099 | false | false | 2024-09-14 | 0 | Removed |
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0 | 9367c1273b0025793531fcf3a2c15416539f5d81 | 13.074907 | 0 | 8.03 | false | false | false | false | 1.629398 | 0.06682 | 6.682048 | 0.477477 | 26.737652 | 0.067221 | 6.722054 | 0.286074 | 4.809843 | 0.378583 | 8.122917 | 0.328374 | 25.374926 | false | false | 2024-09-20 | 0 | Removed |
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.1.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1 | a6188cd1807d0d72e55adc371ddd198d7e9aa7ae | 13.349347 | 0 | 8.03 | false | false | false | false | 1.581177 | 0.091051 | 9.105063 | 0.474865 | 26.412551 | 0.059668 | 5.966767 | 0.291107 | 5.480984 | 0.38249 | 7.811198 | 0.327876 | 25.319518 | false | false | 2024-09-23 | 0 | Removed |
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.2.0_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.2.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0 | 15a9988381fdba15281f1bd6b04c34f3f96120cc | 19.081856 | 0 | 8.03 | false | false | false | true | 1.687433 | 0.506409 | 50.640855 | 0.462426 | 24.734771 | 0.073263 | 7.326284 | 0.293624 | 5.816555 | 0.364448 | 3.75599 | 0.29995 | 22.216681 | false | false | 2024-09-30 | 0 | Removed |
Dans-DiscountModels_Mistral-7b-v0.3-Test-E0.7_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/Mistral-7b-v0.3-Test-E0.7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/Mistral-7b-v0.3-Test-E0.7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Mistral-7b-v0.3-Test-E0.7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/Mistral-7b-v0.3-Test-E0.7 | e91ad0ada3f0d906bacd3c0ad41da4f65ce77b08 | 19.169864 | 0 | 7 | false | false | false | true | 0.875771 | 0.512354 | 51.235389 | 0.475022 | 26.820762 | 0.033988 | 3.398792 | 0.296141 | 6.152125 | 0.40051 | 8.030469 | 0.274435 | 19.381649 | false | false | 2024-11-15 | 0 | Removed |
Dans-DiscountModels_mistral-7b-test-merged_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/Dans-DiscountModels/mistral-7b-test-merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Dans-DiscountModels/mistral-7b-test-merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__mistral-7b-test-merged-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Dans-DiscountModels/mistral-7b-test-merged | 9db677cc43fb88852d952ef5914e919e65dd03eb | 22.073339 | apache-2.0 | 0 | 7 | true | false | false | true | 2.336846 | 0.6678 | 66.780033 | 0.489817 | 28.941005 | 0.044562 | 4.456193 | 0.294463 | 5.928412 | 0.375396 | 4.357813 | 0.297789 | 21.976581 | false | false | 2024-11-27 | 2024-11-30 | 1 | Dans-DiscountModels/mistral-7b-test-merged (Merge) |
Darkknight535_OpenCrystal-12B-L3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/Darkknight535/OpenCrystal-12B-L3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Darkknight535/OpenCrystal-12B-L3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Darkknight535__OpenCrystal-12B-L3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | Darkknight535/OpenCrystal-12B-L3 | 974d2d453afdde40f6a993601bbbbf9d97b43606 | 20.685476 | 15 | 11.52 | false | false | false | false | 4.02457 | 0.407091 | 40.709096 | 0.52226 | 31.844491 | 0.089879 | 8.987915 | 0.306208 | 7.494407 | 0.365656 | 5.740365 | 0.364029 | 29.336584 | false | false | 2024-08-25 | 2024-08-26 | 0 | Darkknight535/OpenCrystal-12B-L3 |
DavidAU_DeepHermes-3-Llama-3-8B-Preview-16.5B-Brainstorm_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepHermes-3-Llama-3-8B-Preview-16.5B-Brainstorm" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepHermes-3-Llama-3-8B-Preview-16.5B-Brainstorm</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepHermes-3-Llama-3-8B-Preview-16.5B-Brainstorm-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepHermes-3-Llama-3-8B-Preview-16.5B-Brainstorm | e32bfdb8f5ac6f0fb644a1fcf91b0b82cadba260 | 18.452582 | 0 | 16.537 | false | false | false | false | 2.524828 | 0.313568 | 31.3568 | 0.476223 | 24.908754 | 0.10574 | 10.574018 | 0.313758 | 8.501119 | 0.392781 | 10.83099 | 0.320894 | 24.543809 | false | false | 2025-02-21 | 2025-03-10 | 1 | DavidAU/DeepHermes-3-Llama-3-8B-Preview-16.5B-Brainstorm (Merge) |
DavidAU_DeepSeek-BlackRoot-R1-Distill-Llama-3.1-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepSeek-BlackRoot-R1-Distill-Llama-3.1-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepSeek-BlackRoot-R1-Distill-Llama-3.1-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepSeek-BlackRoot-R1-Distill-Llama-3.1-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepSeek-BlackRoot-R1-Distill-Llama-3.1-8B | 17c6339702cda2eb3feb08aec58b8e681ac4e678 | 19.075555 | 1 | 8.03 | false | false | false | false | 0.717378 | 0.368498 | 36.849781 | 0.488694 | 27.616644 | 0.06571 | 6.570997 | 0.317953 | 9.060403 | 0.431979 | 12.397396 | 0.297623 | 21.958112 | false | false | 2025-02-15 | 2025-03-10 | 1 | DavidAU/DeepSeek-BlackRoot-R1-Distill-Llama-3.1-8B (Merge) |
DavidAU_DeepSeek-Grand-Horror-SMB-R1-Distill-Llama-3.1-16B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepSeek-Grand-Horror-SMB-R1-Distill-Llama-3.1-16B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepSeek-Grand-Horror-SMB-R1-Distill-Llama-3.1-16B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepSeek-Grand-Horror-SMB-R1-Distill-Llama-3.1-16B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepSeek-Grand-Horror-SMB-R1-Distill-Llama-3.1-16B | ffc26e5c5ffbf42976e5bdc13ea858127eb96cf7 | 14.762746 | 2 | 15.664 | false | false | false | false | 2.630745 | 0.250695 | 25.069482 | 0.448781 | 22.777139 | 0.029456 | 2.945619 | 0.313758 | 8.501119 | 0.416448 | 10.289323 | 0.270944 | 18.993794 | false | false | 2025-02-09 | 2025-03-10 | 1 | DavidAU/DeepSeek-Grand-Horror-SMB-R1-Distill-Llama-3.1-16B (Merge) |
DavidAU_DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Deep-Thinker-Uncensored-24B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Deep-Thinker-Uncensored-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Deep-Thinker-Uncensored-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Deep-Thinker-Uncensored-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Deep-Thinker-Uncensored-24B | 95a95ccc16bb0d1c36a78e2bdf68bc60148608a3 | 20.033733 | 0 | 24.942 | false | true | false | false | 2.381683 | 0.388256 | 38.825649 | 0.488603 | 27.77355 | 0.081571 | 8.1571 | 0.322987 | 9.731544 | 0.4375 | 13.220833 | 0.302443 | 22.49372 | false | false | 2025-02-15 | 2025-03-10 | 1 | DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Deep-Thinker-Uncensored-24B (Merge) |
DavidAU_DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Mad-Scientist-24B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Mad-Scientist-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Mad-Scientist-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Mad-Scientist-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Mad-Scientist-24B | edd481ba969388f951af26d4a256538d02355342 | 18.805315 | 0 | 24.942 | false | true | false | false | 2.674696 | 0.343618 | 34.361827 | 0.476938 | 25.614434 | 0.075529 | 7.55287 | 0.337248 | 11.63311 | 0.423083 | 11.785417 | 0.296958 | 21.884235 | false | false | 2025-02-15 | 2025-03-10 | 1 | DavidAU/DeepSeek-MOE-4X8B-R1-Distill-Llama-3.1-Mad-Scientist-24B (Merge) |
DavidAU_DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm | b96213180934664665855bae599d2a4c2023b68a | 35.281878 | 0 | 25.506 | false | false | false | false | 5.562347 | 0.341595 | 34.159475 | 0.58069 | 38.548491 | 0.553625 | 55.362538 | 0.385906 | 18.120805 | 0.51551 | 25.238802 | 0.46235 | 40.261155 | false | false | 2025-02-21 | 2025-03-10 | 1 | DavidAU/DeepSeek-R1-Distill-Qwen-25.5B-Brainstorm (Merge) |
DavidAU_DeepSeek-V2-Grand-Horror-SMB-R1-Distill-Llama-3.1-Uncensored-16.5B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepSeek-V2-Grand-Horror-SMB-R1-Distill-Llama-3.1-Uncensored-16.5B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepSeek-V2-Grand-Horror-SMB-R1-Distill-Llama-3.1-Uncensored-16.5B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepSeek-V2-Grand-Horror-SMB-R1-Distill-Llama-3.1-Uncensored-16.5B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepSeek-V2-Grand-Horror-SMB-R1-Distill-Llama-3.1-Uncensored-16.5B | 0b1829e4631ff716278c81dac4ed1cab655a3505 | 15.169196 | 0 | 16.537 | false | false | false | false | 2.791403 | 0.285316 | 28.531629 | 0.446238 | 22.878424 | 0.017372 | 1.73716 | 0.305369 | 7.38255 | 0.417875 | 10.734375 | 0.277759 | 19.751034 | false | false | 2025-02-09 | 2025-03-10 | 1 | DavidAU/DeepSeek-V2-Grand-Horror-SMB-R1-Distill-Llama-3.1-Uncensored-16.5B (Merge) |
DavidAU_DeepThought-MOE-8X3B-R1-Llama-3.2-Reasoning-18B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/DeepThought-MOE-8X3B-R1-Llama-3.2-Reasoning-18B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/DeepThought-MOE-8X3B-R1-Llama-3.2-Reasoning-18B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__DeepThought-MOE-8X3B-R1-Llama-3.2-Reasoning-18B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/DeepThought-MOE-8X3B-R1-Llama-3.2-Reasoning-18B | 39b96b8ceca904a96a5e3e524c2a9513f1850bdd | 16.128174 | 0 | 18.405 | false | true | false | false | 3.524037 | 0.379314 | 37.931355 | 0.42323 | 18.810857 | 0.108006 | 10.800604 | 0.279362 | 3.914989 | 0.355979 | 6.197396 | 0.272025 | 19.113845 | false | false | 2025-02-21 | 2025-03-10 | 1 | DavidAU/DeepThought-MOE-8X3B-R1-Llama-3.2-Reasoning-18B (Merge) |
DavidAU_Gemma-The-Writer-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/Gemma-The-Writer-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/Gemma-The-Writer-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/Gemma-The-Writer-9B | fcd6c9a1d0f6acc5bffc7df72cd8e996a9573937 | 20.571195 | 5 | 10.159 | false | false | false | true | 3.968037 | 0.174032 | 17.403157 | 0.590544 | 41.272319 | 0.087613 | 8.761329 | 0.345638 | 12.751678 | 0.409875 | 10.134375 | 0.397939 | 33.104314 | false | false | 2024-09-26 | 2025-01-11 | 1 | DavidAU/Gemma-The-Writer-9B (Merge) |
DavidAU_Gemma-The-Writer-DEADLINE-10B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/Gemma-The-Writer-DEADLINE-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/Gemma-The-Writer-DEADLINE-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-DEADLINE-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/Gemma-The-Writer-DEADLINE-10B | 69f38a595090ce6ba154b21d9d8b4c690f02b74e | 21.676641 | 0 | 10.952 | false | false | false | true | 5.19759 | 0.233158 | 23.315802 | 0.589609 | 41.019199 | 0.098943 | 9.89426 | 0.342282 | 12.304251 | 0.418865 | 10.791406 | 0.394614 | 32.734929 | false | false | 2024-10-27 | 2025-01-11 | 1 | DavidAU/Gemma-The-Writer-DEADLINE-10B (Merge) |
DavidAU_Gemma-The-Writer-J.GutenBerg-10B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/Gemma-The-Writer-J.GutenBerg-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/Gemma-The-Writer-J.GutenBerg-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-J.GutenBerg-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/Gemma-The-Writer-J.GutenBerg-10B | 7318b14104e3eb06c8e571ec8a51c7f027834d74 | 22.350743 | 0 | 10.034 | false | false | false | true | 5.037345 | 0.285789 | 28.578948 | 0.590942 | 41.155991 | 0.092145 | 9.214502 | 0.338087 | 11.744966 | 0.417594 | 10.665885 | 0.394697 | 32.744164 | false | false | 2024-10-30 | 2025-01-11 | 1 | DavidAU/Gemma-The-Writer-J.GutenBerg-10B (Merge) |
DavidAU_Gemma-The-Writer-Mighty-Sword-9B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/DavidAU/Gemma-The-Writer-Mighty-Sword-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DavidAU/Gemma-The-Writer-Mighty-Sword-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__Gemma-The-Writer-Mighty-Sword-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | DavidAU/Gemma-The-Writer-Mighty-Sword-9B | 39e655b61e11cd9a53529c6bdf0e6357b5be6b2c | 32.033824 | 4 | 10.159 | false | false | false | true | 2.842983 | 0.752755 | 75.275491 | 0.591196 | 41.39261 | 0.191088 | 19.108761 | 0.348154 | 13.087248 | 0.411177 | 10.363802 | 0.396775 | 32.97503 | false | false | 2024-12-25 | 2025-01-11 | 1 | DavidAU/Gemma-The-Writer-Mighty-Sword-9B (Merge) |