eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
allknowingroger_Qwenslerp3-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp3-14B | ac60a6c4e224e5b52c42bebfd0cf81f920befdef | 38.080923 | apache-2.0 | 1 | 14.77 | true | false | false | false | 4.439262 | 0.505235 | 50.5235 | 0.652084 | 49.809829 | 0.446375 | 44.637462 | 0.375 | 16.666667 | 0.467604 | 18.017187 | 0.539478 | 48.830895 | true | false | 2024-10-19 | 2024-10-21 | 1 | allknowingroger/Qwenslerp3-14B (Merge) |
allknowingroger_Qwenslerp3-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp3-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp3-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp3-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp3-7B | 0351c5f6207cafd15e10e6d8dfe61b50d1b2378b | 30.63275 | apache-2.0 | 0 | 7.616 | true | false | false | false | 1.218891 | 0.501837 | 50.183735 | 0.558016 | 37.153984 | 0.321752 | 32.175227 | 0.324664 | 9.955257 | 0.45151 | 14.972135 | 0.454205 | 39.356161 | true | false | 2024-10-31 | 2024-11-04 | 1 | allknowingroger/Qwenslerp3-7B (Merge) |
allknowingroger_ROGERphi-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/ROGERphi-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/ROGERphi-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__ROGERphi-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/ROGERphi-7B-slerp | a92f90ae5e4286daa2399df4951a3347aaf414e1 | 20.707471 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.232699 | 0.386133 | 38.613324 | 0.519558 | 32.819032 | 0.073263 | 7.326284 | 0.288591 | 5.145414 | 0.468531 | 17.533073 | 0.305269 | 22.807698 | true | false | 2024-03-20 | 2024-06-26 | 1 | allknowingroger/ROGERphi-7B-slerp (Merge) |
allknowingroger_RogerMerge-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/RogerMerge-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/RogerMerge-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__RogerMerge-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/RogerMerge-7B-slerp | 397f5c0b52a536c130982ca2a7c3056358bbdf92 | 19.617736 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.235716 | 0.393302 | 39.330199 | 0.516018 | 31.987166 | 0.068731 | 6.873112 | 0.280201 | 4.026846 | 0.431979 | 12.930729 | 0.303025 | 22.558363 | true | false | 2024-04-11 | 2024-06-26 | 1 | allknowingroger/RogerMerge-7B-slerp (Merge) |
allknowingroger_Rombos-LLM-V2.5-Qwen-42b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Rombos-LLM-V2.5-Qwen-42b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Rombos-LLM-V2.5-Qwen-42b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Rombos-LLM-V2.5-Qwen-42b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Rombos-LLM-V2.5-Qwen-42b | 977192ef80c5c904697f1d85d2eeab5db3947c65 | 4.559641 | | 0 | 42.516 | false | false | false | false | 16.895765 | 0.187921 | 18.792138 | 0.296916 | 2.60764 | 0 | 0 | 0.262584 | 1.677852 | 0.363333 | 2.416667 | 0.116772 | 1.863549 | false | false | 2024-10-21 | | 0 | Removed |
allknowingroger_Strangecoven-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Strangecoven-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Strangecoven-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Strangecoven-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Strangecoven-7B-slerp | 8bc9d8f972d15fdd3e02c602ef4f549493bf2208 | 20.311977 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.264938 | 0.374643 | 37.464261 | 0.536802 | 34.832235 | 0.076284 | 7.628399 | 0.28943 | 5.257271 | 0.419885 | 10.41901 | 0.336436 | 26.270686 | true | false | 2024-05-16 | 2024-06-26 | 1 | allknowingroger/Strangecoven-7B-slerp (Merge) |
allknowingroger_Weirdslerp2-25B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Weirdslerp2-25B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Weirdslerp2-25B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Weirdslerp2-25B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Weirdslerp2-25B | 4221341fe45e3ee6eaab27830b27d46bbbd5ea23 | 4.039649 | | 0 | 25.204 | false | false | false | false | 3.415425 | 0.175407 | 17.540681 | 0.28737 | 1.565992 | 0 | 0 | 0.249161 | 0 | 0.352354 | 3.710937 | 0.112783 | 1.420287 | false | false | 2024-10-21 | | 0 | Removed |
allknowingroger_WestlakeMaziyar-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/WestlakeMaziyar-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/WestlakeMaziyar-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__WestlakeMaziyar-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/WestlakeMaziyar-7B-slerp | 751534a844b0d439fe62f98bf8882fe9ab9872e0 | 22.183417 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.263341 | 0.483777 | 48.377749 | 0.524548 | 33.342811 | 0.066465 | 6.646526 | 0.303691 | 7.158837 | 0.447385 | 14.489844 | 0.307763 | 23.084737 | true | false | 2024-05-16 | 2024-06-26 | 1 | allknowingroger/WestlakeMaziyar-7B-slerp (Merge) |
allknowingroger_YamMaths-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/YamMaths-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/YamMaths-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__YamMaths-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/YamMaths-7B-slerp | bd4ac9d63ca88c80d34fa60ef5cbb56d60a39077 | 20.552307 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.225247 | 0.414809 | 41.480937 | 0.515585 | 32.133322 | 0.085347 | 8.534743 | 0.280201 | 4.026846 | 0.438365 | 13.46224 | 0.313082 | 23.675754 | true | false | 2024-06-02 | 2024-06-26 | 1 | allknowingroger/YamMaths-7B-slerp (Merge) |
allknowingroger_Yi-1.5-34B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yi-1.5-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yi-1.5-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yi-1.5-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yi-1.5-34B | fef96e380cb3aeecac8e2e53ad2c73a1187beb68 | 4.252733 | | 0 | 34.389 | false | false | false | false | 10.504515 | 0.163916 | 16.391619 | 0.282725 | 1.339043 | 0 | 0 | 0.258389 | 1.118568 | 0.385656 | 5.607031 | 0.109541 | 1.060136 | false | false | 2024-10-21 | | 0 | Removed |
allknowingroger_Yi-blossom-40B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yi-blossom-40B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yi-blossom-40B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yi-blossom-40B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yi-blossom-40B | d1bf1cf9339808193c5a56ef23fecdfd1012acfb | 5.827458 | | 0 | 18.769 | false | false | false | false | 1.976915 | 0.200886 | 20.088587 | 0.321504 | 5.539183 | 0 | 0 | 0.274329 | 3.243848 | 0.38426 | 5.199219 | 0.108045 | 0.893913 | false | false | 2024-09-19 | | 0 | Removed |
allknowingroger_Yibuddy-35B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yibuddy-35B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yibuddy-35B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yibuddy-35B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yibuddy-35B | 592e1e52b97ec88a80ba3b496c19f2498ada4ea3 | 28.283171 | apache-2.0 | 0 | 34.389 | true | false | false | false | 10.760125 | 0.423477 | 42.347748 | 0.591619 | 42.808242 | 0.1571 | 15.70997 | 0.355705 | 14.09396 | 0.450458 | 15.973958 | 0.448886 | 38.765145 | true | false | 2024-09-17 | 2024-10-08 | 1 | allknowingroger/Yibuddy-35B (Merge) |
allknowingroger_Yillama-40B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yillama-40B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yillama-40B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yillama-40B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yillama-40B | 65db687755e716481a218cac99d20619d78e41f7 | 8.311487 | | 0 | 34.389 | false | false | false | false | 7.048132 | 0.169686 | 16.968643 | 0.406289 | 15.875797 | 0 | 0 | 0.282718 | 4.362416 | 0.350063 | 1.757812 | 0.198138 | 10.904255 | false | false | 2024-09-19 | | 0 | Removed |
allknowingroger_Yislerp-34B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yislerp-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yislerp-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yislerp-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yislerp-34B | 131ad918edd652271510ee8dba63d3e7319df133 | 29.398926 | apache-2.0 | 0 | 34.389 | true | false | false | false | 5.977556 | 0.369197 | 36.919706 | 0.615872 | 45.981696 | 0.216012 | 21.601208 | 0.358221 | 14.42953 | 0.456625 | 15.778125 | 0.47515 | 41.683289 | true | false | 2024-09-16 | 2024-09-19 | 1 | allknowingroger/Yislerp-34B (Merge) |
allknowingroger_Yislerp2-34B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yislerp2-34B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yislerp2-34B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yislerp2-34B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yislerp2-34B | 3147cf866736b786347928b655c887e8b9c07bfc | 30.433865 | apache-2.0 | 0 | 34.389 | true | false | false | false | 9.10231 | 0.399947 | 39.994659 | 0.624577 | 47.202306 | 0.229607 | 22.960725 | 0.364094 | 15.212528 | 0.452969 | 15.854427 | 0.472407 | 41.378546 | true | false | 2024-09-17 | 2024-10-08 | 1 | allknowingroger/Yislerp2-34B (Merge) |
allknowingroger_Yunconglong-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Yunconglong-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Yunconglong-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Yunconglong-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Yunconglong-13B-slerp | dead687b7342d875bd8ac73bfcd34b88a2e5564c | 19.600104 | | 0 | 12.879 | false | false | false | false | 1.599263 | 0.424177 | 42.417674 | 0.516581 | 32.140729 | 0.054381 | 5.438066 | 0.28104 | 4.138702 | 0.416073 | 10.842448 | 0.303607 | 22.623005 | false | false | 2024-06-26 | | 0 | Removed |
allknowingroger_limyClown-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/limyClown-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/limyClown-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__limyClown-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/limyClown-7B-slerp | 732a1ed0c2c7007297ad9d9797793073825f65ca | 19.703889 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.220101 | 0.401745 | 40.174515 | 0.514752 | 31.931466 | 0.068731 | 6.873112 | 0.28104 | 4.138702 | 0.429313 | 12.464063 | 0.303773 | 22.641475 | true | false | 2024-03-23 | 2024-06-26 | 1 | allknowingroger/limyClown-7B-slerp (Merge) |
allknowingroger_llama3-Jallabi-40B-s_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/llama3-Jallabi-40B-s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/llama3-Jallabi-40B-s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__llama3-Jallabi-40B-s-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/llama3-Jallabi-40B-s | a86d8cc3530fb466245b2cac55f25c28d0bd8c22 | 5.029702 | | 0 | 18.769 | false | false | false | false | 1.959149 | 0.192068 | 19.206816 | 0.325224 | 5.957912 | 0 | 0 | 0.237416 | 0 | 0.374958 | 4.036458 | 0.108793 | 0.977024 | false | false | 2024-09-19 | | 0 | Removed |
allknowingroger_llama3AnFeng-40B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/llama3AnFeng-40B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/llama3AnFeng-40B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__llama3AnFeng-40B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/llama3AnFeng-40B | 5995441962287970ffc98ad9b292e14420bf49ca | 9.237994 | | 0 | 39.971 | false | false | false | false | 8.126936 | 0.174208 | 17.420777 | 0.379408 | 12.476996 | 0 | 0 | 0.306208 | 7.494407 | 0.394 | 7.15 | 0.197972 | 10.885786 | false | false | 2024-09-19 | | 0 | Removed |
allura-org_L3.1-8b-RP-Ink_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/L3.1-8b-RP-Ink" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/L3.1-8b-RP-Ink</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__L3.1-8b-RP-Ink-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/L3.1-8b-RP-Ink | 5d487fff4e2d4ae18193b843484b3bd21d09b07c | 25.096017 | llama3.1 | 3 | 8.03 | true | false | false | true | 0.668821 | 0.781106 | 78.110635 | 0.482847 | 26.318229 | 0.148036 | 14.803625 | 0.264262 | 1.901566 | 0.360823 | 2.469531 | 0.342753 | 26.972518 | false | false | 2025-01-26 | 2025-02-23 | 1 | allura-org/L3.1-8b-RP-Ink (Merge) |
allura-org_MN-12b-RP-Ink_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/MN-12b-RP-Ink" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/MN-12b-RP-Ink</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__MN-12b-RP-Ink-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/MN-12b-RP-Ink | 812fe1585cdf347284bd82b24e09a5308b899f71 | 24.976661 | apache-2.0 | 10 | 12.248 | true | false | false | true | 0.829416 | 0.718633 | 71.863323 | 0.483383 | 26.610598 | 0.11858 | 11.858006 | 0.285235 | 4.697987 | 0.381844 | 6.897135 | 0.351396 | 27.93292 | false | false | 2024-12-25 | 2025-02-23 | 1 | allura-org/MN-12b-RP-Ink (Merge) |
allura-org_MS-Meadowlark-22B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/MS-Meadowlark-22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/MS-Meadowlark-22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__MS-Meadowlark-22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/MS-Meadowlark-22B | 6eb2f6bee66dbffa1b17397e75a7380ed4f9d0ac | 27.097964 | other | 13 | 22.247 | true | false | false | true | 4.343852 | 0.669699 | 66.969862 | 0.516258 | 30.29658 | 0.183535 | 18.353474 | 0.325503 | 10.067114 | 0.38426 | 5.532552 | 0.382314 | 31.368203 | true | false | 2024-10-18 | 2024-10-24 | 1 | allura-org/MS-Meadowlark-22B (Merge) |
allura-org_Mistral-Small-24b-Sertraline-0304_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/Mistral-Small-24b-Sertraline-0304" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/Mistral-Small-24b-Sertraline-0304</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__Mistral-Small-24b-Sertraline-0304-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/Mistral-Small-24b-Sertraline-0304 | 26c68b8c4de900ffc392567961d4f516b384a077 | 35.369805 | apache-2.0 | 3 | 23.572 | true | false | false | true | 1.470543 | 0.67999 | 67.99902 | 0.652491 | 49.281458 | 0.22281 | 22.280967 | 0.35151 | 13.534676 | 0.43951 | 13.505469 | 0.510555 | 45.617243 | false | false | 2025-03-04 | 2025-03-04 | 1 | allura-org/Mistral-Small-24b-Sertraline-0304 (Merge) |
allura-org_Mistral-Small-Sisyphus-24b-2503_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/Mistral-Small-Sisyphus-24b-2503" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/Mistral-Small-Sisyphus-24b-2503</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__Mistral-Small-Sisyphus-24b-2503-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/Mistral-Small-Sisyphus-24b-2503 | fa6d90550cad060664c47aac56b511e6584262a0 | 32.5029 | apache-2.0 | 8 | 23.572 | true | false | false | true | 2.743696 | 0.684836 | 68.483623 | 0.626979 | 46.420978 | 0.25 | 25 | 0.262584 | 1.677852 | 0.397687 | 7.577604 | 0.512716 | 45.857343 | false | false | 2025-03-03 | 2025-03-03 | 1 | allura-org/Mistral-Small-Sisyphus-24b-2503 (Merge) |
allura-org_MoE-Girl-1BA-7BT_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | OlmoeForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/MoE-Girl-1BA-7BT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/MoE-Girl-1BA-7BT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__MoE-Girl-1BA-7BT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/MoE-Girl-1BA-7BT | ecfac73ab9e7f2ee006d6a2ad9c8e86a85deab2b | 6.402799 | apache-2.0 | 15 | 6.919 | true | true | false | true | 6.402309 | 0.270503 | 27.050338 | 0.313918 | 4.842344 | 0.015106 | 1.510574 | 0.258389 | 1.118568 | 0.343552 | 1.477344 | 0.121759 | 2.417627 | false | false | 2024-10-08 | 2024-10-10 | 1 | allenai/OLMoE-1B-7B-0924 |
allura-org_TQ2.5-14B-Aletheia-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/TQ2.5-14B-Aletheia-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/TQ2.5-14B-Aletheia-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__TQ2.5-14B-Aletheia-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/TQ2.5-14B-Aletheia-v1 | c7fbe91dbdb85161464f87c261b588dbf674e514 | 39.482472 | apache-2.0 | 5 | 14.77 | true | false | false | true | 2.825806 | 0.75303 | 75.302974 | 0.658507 | 50.881442 | 0.339879 | 33.987915 | 0.362416 | 14.988814 | 0.445156 | 14.611198 | 0.524102 | 47.122488 | true | false | 2024-12-19 | 2024-12-29 | 1 | allura-org/TQ2.5-14B-Aletheia-v1 (Merge) |
allura-org_TQ2.5-14B-Neon-v1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/TQ2.5-14B-Neon-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/TQ2.5-14B-Neon-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__TQ2.5-14B-Neon-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/TQ2.5-14B-Neon-v1 | f83a719a5c02c1b9ec5225585978d1f1595f8da7 | 39.14031 | apache-2.0 | 10 | 14.77 | true | false | false | true | 1.483962 | 0.675419 | 67.5419 | 0.655304 | 50.510093 | 0.360272 | 36.02719 | 0.371644 | 16.219239 | 0.461 | 17.291667 | 0.525266 | 47.251773 | false | false | 2024-12-19 | 2025-02-17 | 2 | arcee-ai/SuperNova-Medius (Merge) |
allura-org_Teleut-7b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allura-org/Teleut-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allura-org/Teleut-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allura-org__Teleut-7b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allura-org/Teleut-7b | 259e5b8b84d8ddee69db34cdd237ce5ac5c8c4cf | 30.229494 | apache-2.0 | 12 | 7.616 | true | false | false | true | 2.100197 | 0.637875 | 63.787528 | 0.514128 | 30.859919 | 0.240937 | 24.093656 | 0.326342 | 10.178971 | 0.464042 | 17.671875 | 0.413065 | 34.785018 | false | false | 2024-11-24 | 2025-01-01 | 1 | Qwen/Qwen2.5-7B |
aloobun_Meta-Llama-3-7B-28Layers_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/aloobun/Meta-Llama-3-7B-28Layers" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aloobun/Meta-Llama-3-7B-28Layers</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aloobun__Meta-Llama-3-7B-28Layers-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aloobun/Meta-Llama-3-7B-28Layers | 9822e6b8d4de0c0f2964d299f6fcef72385a0341 | 13.37569 | llama3 | 0 | 7.158 | true | false | false | false | 1.618717 | 0.196365 | 19.636453 | 0.44375 | 22.09653 | 0.027946 | 2.794562 | 0.294463 | 5.928412 | 0.358927 | 5.799219 | 0.315991 | 23.998966 | true | false | 2024-05-10 | 2024-06-26 | 1 | aloobun/Meta-Llama-3-7B-28Layers (Merge) |
aloobun_d-SmolLM2-360M_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/aloobun/d-SmolLM2-360M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aloobun/d-SmolLM2-360M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aloobun__d-SmolLM2-360M-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aloobun/d-SmolLM2-360M | 2a1d82b4cbcdfdff3c2cfcd171435c5f01b8de43 | 6.184071 | apache-2.0 | 1 | 0.362 | true | false | false | false | 0.740247 | 0.209704 | 20.970359 | 0.319578 | 4.762821 | 0.01284 | 1.283988 | 0.253356 | 0.447427 | 0.398063 | 7.757813 | 0.116938 | 1.882018 | false | false | 2024-11-20 | 2024-11-26 | 0 | aloobun/d-SmolLM2-360M |
alpindale_WizardLM-2-8x22B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/alpindale/WizardLM-2-8x22B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">alpindale/WizardLM-2-8x22B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/alpindale__WizardLM-2-8x22B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | alpindale/WizardLM-2-8x22B | 087834da175523cffd66a7e19583725e798c1b4f | 33.059052 | apache-2.0 | 399 | 140.621 | true | false | false | false | 186.610443 | 0.527217 | 52.721667 | 0.637731 | 48.576168 | 0.25 | 25 | 0.381711 | 17.561521 | 0.438708 | 14.538542 | 0.459608 | 39.956413 | false | false | 2024-04-16 | 2024-06-28 | 0 | alpindale/WizardLM-2-8x22B |
alpindale_magnum-72b-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/alpindale/magnum-72b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">alpindale/magnum-72b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/alpindale__magnum-72b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | alpindale/magnum-72b-v1 | fef27e0f235ae8858b84b765db773a2a954110dd | 42.929055 | other | 166 | 72.706 | true | false | false | true | 25.030245 | 0.760648 | 76.064841 | 0.698222 | 57.653185 | 0.398036 | 39.803625 | 0.39094 | 18.791946 | 0.448938 | 15.617187 | 0.546792 | 49.643543 | false | false | 2024-06-17 | 2024-07-25 | 2 | Qwen/Qwen2-72B |
altomek_YiSM-34B-0rn_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/altomek/YiSM-34B-0rn" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">altomek/YiSM-34B-0rn</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/altomek__YiSM-34B-0rn-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | altomek/YiSM-34B-0rn | 7a481c67cbdd5c846d6aaab5ef9f1eebfad812c2 | 30.512012 | apache-2.0 | 1 | 34.389 | true | false | false | true | 5.921248 | 0.428373 | 42.837338 | 0.614001 | 45.382927 | 0.228097 | 22.809668 | 0.371644 | 16.219239 | 0.445 | 14.758333 | 0.469581 | 41.064569 | true | false | 2024-05-26 | 2024-06-27 | 1 | altomek/YiSM-34B-0rn (Merge) |
amazon_MegaBeam-Mistral-7B-300k_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/amazon/MegaBeam-Mistral-7B-300k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">amazon/MegaBeam-Mistral-7B-300k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/amazon__MegaBeam-Mistral-7B-300k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | amazon/MegaBeam-Mistral-7B-300k | 42572e5c9a0747b19af5c5c9962d122622f32295 | 17.022471 | apache-2.0 | 16 | 7.242 | true | false | false | true | 1.29922 | 0.520347 | 52.034712 | 0.422773 | 19.291806 | 0.021148 | 2.114804 | 0.27349 | 3.131991 | 0.398 | 8.35 | 0.254904 | 17.21151 | false | false | 2024-05-13 | 2024-10-07 | 0 | amazon/MegaBeam-Mistral-7B-300k |
amd_AMD-Llama-135m_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/amd/AMD-Llama-135m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">amd/AMD-Llama-135m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/amd__AMD-Llama-135m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | amd/AMD-Llama-135m | 8f9c39b5ed86d422ab332ed1ecf042fdaeb57903 | 4.759627 | apache-2.0 | 111 | 0.135 | true | false | false | false | 0.128719 | 0.184225 | 18.422452 | 0.297393 | 2.485495 | 0.005287 | 0.528701 | 0.252517 | 0.33557 | 0.377969 | 4.91276 | 0.116855 | 1.872784 | false | true | 2024-07-19 | 2024-09-29 | 0 | amd/AMD-Llama-135m |
amd_AMD-Llama-135m_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/amd/AMD-Llama-135m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">amd/AMD-Llama-135m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/amd__AMD-Llama-135m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | amd/AMD-Llama-135m | 8f9c39b5ed86d422ab332ed1ecf042fdaeb57903 | 5.228977 | apache-2.0 | 111 | 0.134 | true | false | false | false | 0.708678 | 0.191843 | 19.18432 | 0.296944 | 2.537953 | 0.007553 | 0.755287 | 0.258389 | 1.118568 | 0.384573 | 5.904948 | 0.116855 | 1.872784 | false | true | 2024-07-19 | 2024-10-01 | 0 | amd/AMD-Llama-135m |
anakin87_gemma-2b-orpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | GemmaForCausalLM | <a target="_blank" href="https://huggingface.co/anakin87/gemma-2b-orpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anakin87/gemma-2b-orpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anakin87__gemma-2b-orpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anakin87/gemma-2b-orpo | bf6bfe30c31c18620767ad60d0bff89343804230 | 7.284706 | other | 28 | 2.506 | true | false | false | true | 1.579853 | 0.247797 | 24.779696 | 0.342617 | 7.949445 | 0.018882 | 1.888218 | 0.261745 | 1.565996 | 0.37276 | 4.128385 | 0.130568 | 3.396498 | false | false | 2024-03-24 | 2024-07-06 | 1 | google/gemma-2b |
anthracite-org_magnum-v1-72b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v1-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v1-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v1-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v1-72b | f8f85021bace7e8250ed8559c5b78b8b34f0c4cc | 42.962915 | other | 166 | 72.706 | true | false | false | true | 25.782112 | 0.760648 | 76.064841 | 0.698222 | 57.653185 | 0.398036 | 39.803625 | 0.39094 | 18.791946 | 0.448938 | 15.617187 | 0.54862 | 49.846705 | false | false | 2024-06-17 | 2024-09-21 | 2 | Qwen/Qwen2-72B |
anthracite-org_magnum-v2-12b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v2-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v2-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v2-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v2-12b | | 18.795822 | apache-2.0 | 86 | 12.248 | true | false | false | true | 3.297671 | 0.376166 | 37.616635 | 0.502086 | 28.785552 | 0.054381 | 5.438066 | 0.291107 | 5.480984 | 0.417906 | 11.371615 | 0.316739 | 24.082077 | false | false | 2024-08-03 | 2024-09-05 | 1 | mistralai/Mistral-Nemo-Base-2407 |
anthracite-org_magnum-v2-72b_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v2-72b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v2-72b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v2-72b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v2-72b | c9c5826ef42b9fcc8a8e1079be574481cf0b6cc6 | 41.782872 | other | 38 | 72.706 | true | false | false | true | 24.268434 | 0.756027 | 75.602734 | 0.700508 | 57.854704 | 0.35423 | 35.422961 | 0.385906 | 18.120805 | 0.437188 | 14.181771 | 0.545628 | 49.514258 | false | false | 2024-08-18 | 2024-09-05 | 2 | Qwen/Qwen2-72B |
anthracite-org_magnum-v2.5-12b-kto_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v2.5-12b-kto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v2.5-12b-kto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v2.5-12b-kto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v2.5-12b-kto | aee0374e5a43e950c9977b0004dede1c57be2999 | 18.98279 | apache-2.0 | 53 | 12.248 | true | false | false | true | 3.218126 | 0.386558 | 38.655767 | 0.507696 | 29.625059 | 0.052115 | 5.21148 | 0.293624 | 5.816555 | 0.408635 | 9.979427 | 0.321476 | 24.608452 | false | false | 2024-08-12 | 2024-08-29 | 2 | mistralai/Mistral-Nemo-Base-2407 |
anthracite-org_magnum-v3-27b-kto_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-27b-kto" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-27b-kto</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-27b-kto-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-27b-kto | 96fbb750b3150e5fe9d6d2fcf757f49310d99a43 | 29.33708 | gemma | 15 | 27.227 | true | false | false | true | 7.875068 | 0.567483 | 56.748317 | 0.586041 | 41.160103 | 0.181269 | 18.126888 | 0.355705 | 14.09396 | 0.385469 | 9.916927 | 0.423787 | 35.976285 | false | false | 2024-09-06 | 2024-09-15 | 1 | anthracite-org/magnum-v3-27b-kto (Merge) |
anthracite-org_magnum-v3-34b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-34b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-34b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-34b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-34b | 3bcd8c3dbb93021a5ce22203c690a1a084cafb73 | 29.666081 | apache-2.0 | 29 | 34.389 | true | false | false | true | 9.225749 | 0.511529 | 51.152941 | 0.608783 | 44.327903 | 0.194864 | 19.486405 | 0.360738 | 14.765101 | 0.38724 | 6.571615 | 0.475233 | 41.692524 | false | false | 2024-08-22 | 2024-09-18 | 0 | anthracite-org/magnum-v3-34b |
anthracite-org_magnum-v3-9b-chatml_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-9b-chatml" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-9b-chatml</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-9b-chatml-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-9b-chatml | 96c2d023c56ef73be095ffbae8cedd7243ebca84 | 19.504116 | gemma | 24 | 9.242 | true | false | false | false | 5.779931 | 0.127471 | 12.747067 | 0.542769 | 35.317875 | 0.069486 | 6.94864 | 0.345638 | 12.751678 | 0.443229 | 13.236979 | 0.424202 | 36.022459 | false | false | 2024-08-27 | 2024-09-18 | 1 | IntervitensInc/gemma-2-9b-chatml |
anthracite-org_magnum-v3-9b-customgemma2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v3-9b-customgemma2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v3-9b-customgemma2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v3-9b-customgemma2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v3-9b-customgemma2 | 9a7cd3d47434bed2bd80e34e45c74e413f8baaa8 | 19.200267 | gemma | 19 | 9.242 | true | false | false | false | 5.80365 | 0.127296 | 12.729558 | 0.534014 | 34.116783 | 0.071752 | 7.175227 | 0.328859 | 10.514541 | 0.456469 | 15.058594 | 0.420462 | 35.6069 | false | false | 2024-08-27 | 2024-09-18 | 1 | google/gemma-2-9b |
anthracite-org_magnum-v4-12b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-12b | 704f2ccfe662052e415499e56789dd88ec01a113 | 20.276427 | apache-2.0 | 39 | 12.248 | true | false | false | false | 3.398031 | 0.339296 | 33.92964 | 0.517669 | 30.503902 | 0.117825 | 11.782477 | 0.296141 | 6.152125 | 0.409281 | 10.360156 | 0.360372 | 28.93026 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-12b |
anthracite-org_magnum-v4-22b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-22b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-22b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-22b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-22b | e5239e71d2628269b453a832de98c1ecb79d2557 | 27.85437 | other | 26 | 22.247 | true | false | false | false | 3.30058 | 0.562862 | 56.286209 | 0.548612 | 35.549149 | 0.200151 | 20.015106 | 0.32802 | 10.402685 | 0.440781 | 13.43099 | 0.382979 | 31.44208 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-22b |
anthracite-org_magnum-v4-27b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-27b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-27b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-27b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-27b | 50a14716bdeb6a9376b9377df31ab1497864f3f9 | 26.633004 | gemma | 18 | 27.227 | true | false | false | false | 11.472708 | 0.345417 | 34.541683 | 0.58673 | 40.960384 | 0.179758 | 17.975831 | 0.369966 | 15.995526 | 0.43799 | 12.815365 | 0.437583 | 37.509235 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-27b |
anthracite-org_magnum-v4-9b_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/anthracite-org/magnum-v4-9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">anthracite-org/magnum-v4-9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/anthracite-org__magnum-v4-9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | anthracite-org/magnum-v4-9b | e9db6cb80f02ca2e2db4538ef59f7a30f69a849d | 23.798995 | gemma | 17 | 9.242 | true | false | false | false | 5.112652 | 0.350263 | 35.026286 | 0.533642 | 33.270404 | 0.130665 | 13.066465 | 0.347315 | 12.975391 | 0.451573 | 15.646615 | 0.395279 | 32.808806 | false | false | 2024-10-20 | 2024-10-23 | 0 | anthracite-org/magnum-v4-9b |
apple_DCLM-7B_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OpenLMModel | <a target="_blank" href="https://huggingface.co/apple/DCLM-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">apple/DCLM-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/apple__DCLM-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | apple/DCLM-7B | c85bfa168f999ce27e954808bc005a2748fda5c5 | 14.112858 | apple-ascl | 834 | 7 | true | false | false | false | 1.259911 | 0.217272 | 21.727239 | 0.423214 | 19.760935 | 0.037009 | 3.700906 | 0.315436 | 8.724832 | 0.392073 | 7.309115 | 0.311087 | 23.454122 | false | false | 2024-07-11 | 2024-08-16 | 0 | apple/DCLM-7B |
appvoid_arco-2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/appvoid/arco-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">appvoid/arco-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/appvoid__arco-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | appvoid/arco-2 | 9bec3c42c5bb557eb218513f4fe26c4edc803f0f | 5.137101 | apache-2.0 | 7 | 0.514 | true | false | false | false | 0.316294 | 0.199137 | 19.913718 | 0.314567 | 4.05915 | 0.013595 | 1.359517 | 0.239094 | 0 | 0.353594 | 4.199219 | 0.111619 | 1.291002 | false | false | 2024-09-22 | 2024-12-23 | 0 | appvoid/arco-2 |
appvoid_arco-2-instruct_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/appvoid/arco-2-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">appvoid/arco-2-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/appvoid__arco-2-instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | appvoid/arco-2-instruct | eb116cfdf0b239d67a874d2bd37b2b748d7d6654 | 5.382511 | apache-2.0 | 0 | 0.514 | true | false | false | false | 0.337533 | 0.216448 | 21.644791 | 0.313305 | 3.913002 | 0.01284 | 1.283988 | 0.238255 | 0 | 0.349594 | 4.199219 | 0.111287 | 1.254063 | true | false | 2024-12-14 | 2024-12-27 | 1 | appvoid/arco-2-instruct (Merge) |
arcee-ai_Arcee-Blitz_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Blitz" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Blitz</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Blitz-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Blitz | cf20caa2bfbfabc71a79c6e73f1a7b1e59c86a9b | 40.012327 | apache-2.0 | 68 | 23.572 | true | false | false | false | 1.444665 | 0.554344 | 55.434359 | 0.660663 | 50.726633 | 0.348187 | 34.818731 | 0.385067 | 18.008949 | 0.504719 | 23.823177 | 0.615359 | 57.262116 | false | false | 2025-02-07 | 2025-02-21 | 1 | arcee-ai/Arcee-Blitz (Merge) |
arcee-ai_Arcee-Maestro-7B-Preview_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Maestro-7B-Preview" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Maestro-7B-Preview</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Maestro-7B-Preview-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Maestro-7B-Preview | 007a65e79e9631b6842a5db89a9bc13936fd3aca | 23.793131 | apache-2.0 | 36 | 7.613 | true | false | false | false | 0.690343 | 0.275025 | 27.502471 | 0.464837 | 25.375556 | 0.499245 | 49.924471 | 0.332215 | 10.961969 | 0.388542 | 6.334375 | 0.303939 | 22.659944 | false | false | 2025-02-10 | 2025-02-21 | 1 | arcee-ai/Arcee-Maestro-7B-Preview (Merge) |
arcee-ai_Arcee-Nova_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Nova" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Nova</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Nova-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Nova | ec3bfe88b83f81481daa04b6789c1e0d32827dc5 | 44.053393 | other | 50 | 72.706 | true | false | false | true | 22.986587 | 0.790749 | 79.074855 | 0.694197 | 56.740988 | 0.438066 | 43.806647 | 0.385067 | 18.008949 | 0.456167 | 17.220833 | 0.545213 | 49.468085 | false | false | 2024-07-16 | 2024-09-19 | 0 | arcee-ai/Arcee-Nova |
arcee-ai_Arcee-Spark_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Spark" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Spark</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Spark-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Spark | 3fe368ea5fd32bc4a8d1bcf42510416f7fa28668 | 28.406546 | apache-2.0 | 87 | 7.616 | true | false | false | true | 2.197066 | 0.562087 | 56.208748 | 0.548947 | 37.138522 | 0.295317 | 29.531722 | 0.307047 | 7.606264 | 0.402094 | 8.595052 | 0.382231 | 31.358969 | false | false | 2024-06-22 | 2024-06-26 | 0 | arcee-ai/Arcee-Spark |
arcee-ai_Arcee-Spark_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Arcee-Spark" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Arcee-Spark</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Arcee-Spark-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Arcee-Spark | 3fe368ea5fd32bc4a8d1bcf42510416f7fa28668 | 25.443169 | apache-2.0 | 87 | 7.616 | true | false | false | true | 1.13604 | 0.571829 | 57.182941 | 0.548086 | 36.92439 | 0.114048 | 11.404834 | 0.306208 | 7.494407 | 0.40076 | 8.395052 | 0.381316 | 31.257388 | false | false | 2024-06-22 | 2024-06-26 | 0 | arcee-ai/Arcee-Spark |
arcee-ai_Llama-3.1-SuperNova-Lite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Llama-3.1-SuperNova-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Llama-3.1-SuperNova-Lite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Llama-3.1-SuperNova-Lite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Llama-3.1-SuperNova-Lite | 76246ca4448c1a11787daee0958b60ab27f17774 | 30.193464 | llama3 | 189 | 8.03 | true | false | false | true | 1.711987 | 0.801739 | 80.173938 | 0.515199 | 31.57234 | 0.182779 | 18.277946 | 0.306208 | 7.494407 | 0.416323 | 11.673698 | 0.387716 | 31.968454 | false | false | 2024-09-10 | 2024-09-17 | 2 | meta-llama/Meta-Llama-3.1-8B |
arcee-ai_Llama-Spark_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Llama-Spark" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Llama-Spark</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Llama-Spark-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Llama-Spark | 6d74a617fbb17a1ada08528f2673c89f84fb062e | 27.037237 | llama3 | 27 | 8.03 | true | false | false | true | 1.661428 | 0.791073 | 79.107324 | 0.50535 | 29.770254 | 0.138973 | 13.897281 | 0.299497 | 6.599553 | 0.359333 | 2.616667 | 0.372091 | 30.232343 | false | false | 2024-07-26 | 2024-08-08 | 0 | arcee-ai/Llama-Spark |
arcee-ai_SuperNova-Medius_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/SuperNova-Medius" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/SuperNova-Medius</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__SuperNova-Medius-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/SuperNova-Medius | e34fafcac2801be1ae5c7eb744e191a08119f2af | 39.154304 | apache-2.0 | 205 | 14.77 | true | false | false | true | 7.687531 | 0.718358 | 71.83584 | 0.637728 | 48.005015 | 0.469033 | 46.903323 | 0.333054 | 11.073826 | 0.423271 | 12.275521 | 0.503491 | 44.832299 | false | false | 2024-10-02 | 2024-10-22 | 1 | arcee-ai/SuperNova-Medius (Merge) |
arcee-ai_Virtuoso-Lite_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Virtuoso-Lite" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Virtuoso-Lite</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Virtuoso-Lite-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Virtuoso-Lite | efc17a8dc63fa6f035c4dfe7be7d138aec837d03 | 36.416106 | other | 34 | 10.306 | true | false | false | true | 2.586022 | 0.809958 | 80.995758 | 0.609852 | 43.898557 | 0.253021 | 25.302115 | 0.34396 | 12.527964 | 0.459542 | 17.542708 | 0.444066 | 38.229536 | true | false | 2025-01-28 | 2025-01-28 | 1 | arcee-ai/Virtuoso-Lite (Merge) |
arcee-ai_Virtuoso-Small_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Virtuoso-Small" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Virtuoso-Small</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Virtuoso-Small-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Virtuoso-Small | ca5dec1c6351ba6f2f0c59e609b94628a29c1459 | 40.536078 | apache-2.0 | 68 | 14.77 | true | false | false | true | 3.028628 | 0.793521 | 79.352119 | 0.651763 | 50.399846 | 0.409366 | 40.936556 | 0.336409 | 11.521253 | 0.433906 | 14.438281 | 0.519116 | 46.56841 | false | false | 2024-12-01 | 2024-12-03 | 1 | arcee-ai/Virtuoso-Small (Merge) |
arcee-ai_Virtuoso-Small-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/Virtuoso-Small-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/Virtuoso-Small-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__Virtuoso-Small-v2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/Virtuoso-Small-v2 | b1e0c424683cae4032aed31f43aa0cbda5255efb | 42.475702 | apache-2.0 | 34 | 14.766 | true | false | false | true | 3.114543 | 0.827318 | 82.731818 | 0.65541 | 50.947991 | 0.466012 | 46.601208 | 0.353188 | 13.758389 | 0.431333 | 14.283333 | 0.518783 | 46.531472 | false | false | 2025-01-30 | 2025-01-30 | 1 | arcee-ai/Virtuoso-Small-v2 (Merge) |
arcee-ai_raspberry-3B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/arcee-ai/raspberry-3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arcee-ai/raspberry-3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arcee-ai__raspberry-3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arcee-ai/raspberry-3B | 66bf1346c060bbfe1f1b98cd22e7a26ada69cf70 | 15.852706 | other | 39 | 3.086 | true | false | false | true | 2.073053 | 0.315416 | 31.541643 | 0.426893 | 19.528234 | 0.103474 | 10.347432 | 0.277685 | 3.691275 | 0.412323 | 9.407031 | 0.285406 | 20.600621 | false | false | 2024-10-05 | 2024-10-07 | 1 | Qwen/Qwen2.5-3B |
argilla_notus-7b-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/argilla/notus-7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notus-7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/argilla__notus-7b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | argilla/notus-7b-v1 | 30172203a2d41cb487bf7e2b92a821080783b2c9 | 18.474262 | mit | 122 | 7.242 | true | false | false | true | 1.335816 | 0.508207 | 50.820711 | 0.451186 | 22.747112 | 0.031722 | 3.172205 | 0.28943 | 5.257271 | 0.336417 | 6.585417 | 0.300366 | 22.262855 | false | true | 2023-11-16 | 2024-06-27 | 2 | mistralai/Mistral-7B-v0.1 |
argilla_notux-8x7b-v1_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/argilla/notux-8x7b-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla/notux-8x7b-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/argilla__notux-8x7b-v1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | argilla/notux-8x7b-v1 | 0b29f9afcbae2ab4c5085638d8f5a7f6d44c6b17 | 24.478584 | apache-2.0 | 165 | 46.703 | true | true | false | true | 30.709772 | 0.542229 | 54.222906 | 0.53633 | 34.758062 | 0.099698 | 9.969789 | 0.308725 | 7.829978 | 0.417594 | 10.532552 | 0.366024 | 29.558215 | false | true | 2023-12-12 | 2024-06-12 | 2 | mistralai/Mixtral-8x7B-v0.1 |
argilla-warehouse_Llama-3.1-8B-MagPie-Ultra_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/argilla-warehouse/Llama-3.1-8B-MagPie-Ultra" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">argilla-warehouse/Llama-3.1-8B-MagPie-Ultra</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/argilla-warehouse__Llama-3.1-8B-MagPie-Ultra-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | argilla-warehouse/Llama-3.1-8B-MagPie-Ultra | 1e12f20ca5db84f65a6db793a65100433aac0ac6 | 19.848991 | llama3.1 | 1 | 8.03 | true | false | false | true | 1.931354 | 0.575651 | 57.565149 | 0.461961 | 23.51631 | 0.077039 | 7.703927 | 0.266779 | 2.237136 | 0.35425 | 4.247917 | 0.314412 | 23.823508 | false | true | 2024-09-26 | 2024-09-30 | 1 | meta-llama/Llama-3.1-8B |
arisin_orca-platypus-13B-slerp_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arisin/orca-platypus-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arisin/orca-platypus-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arisin__orca-platypus-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arisin/orca-platypus-13B-slerp | 679c8aa21e7d0ba79584a4b5eb352ecf26bd7096 | 14.791908 | apache-2.0 | 0 | 13.016 | true | false | false | false | 1.878127 | 0.267181 | 26.718108 | 0.463062 | 24.403766 | 0.015861 | 1.586103 | 0.298658 | 6.487696 | 0.425313 | 11.864063 | 0.259225 | 17.691711 | true | false | 2024-11-23 | 2024-11-23 | 0 | arisin/orca-platypus-13B-slerp |
arshiaafshani_Arsh-V1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/arshiaafshani/Arsh-V1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">arshiaafshani/Arsh-V1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/arshiaafshani__Arsh-V1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | arshiaafshani/Arsh-V1 | ef9dd6e8aa5a4635ee66cf21fb58df6b3a27db7e | 37.544451 | 0 | 13.96 | false | false | false | false | 1.89867 | 0.604328 | 60.432763 | 0.673966 | 53.514272 | 0.262085 | 26.208459 | 0.373322 | 16.442953 | 0.489896 | 21.370312 | 0.525682 | 47.297946 | false | false | 2025-02-25 | 0 | Removed |
asharsha30_LLAMA_Harsha_8_B_ORDP_10k_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/asharsha30/LLAMA_Harsha_8_B_ORDP_10k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">asharsha30/LLAMA_Harsha_8_B_ORDP_10k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/asharsha30__LLAMA_Harsha_8_B_ORDP_10k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | asharsha30/LLAMA_Harsha_8_B_ORDP_10k | c9b04b40cd3915f0576659aafba86b85c22a7ee8 | 16.221361 | apache-2.0 | 0 | 8.03 | true | false | false | true | 1.482239 | 0.346391 | 34.639091 | 0.466871 | 25.725678 | 0.066465 | 6.646526 | 0.27349 | 3.131991 | 0.369656 | 7.073698 | 0.281001 | 20.111185 | false | false | 2024-12-01 | 2024-12-01 | 1 | asharsha30/LLAMA_Harsha_8_B_ORDP_10k (Merge) |
ashercn97_a1-v0.0.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ashercn97/a1-v0.0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ashercn97/a1-v0.0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ashercn97__a1-v0.0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ashercn97/a1-v0.0.1 | 83760a8e55b312880f57247c0d9a4a25a0f2e528 | 21.574991 | 0 | 7.616 | false | false | false | false | 2.164093 | 0.219844 | 21.984446 | 0.518812 | 32.755432 | 0.214502 | 21.450151 | 0.311242 | 8.165548 | 0.411979 | 9.930729 | 0.416473 | 35.163638 | false | false | 2024-11-28 | 2024-11-28 | 0 | ashercn97/a1-v0.0.1 |
ashercn97_a1-v002_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/ashercn97/a1-v002" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ashercn97/a1-v002</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ashercn97__a1-v002-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ashercn97/a1-v002 | f6a33c4c83b57b3a76bc1e79e714bdf05f249ed6 | 22.881616 | apache-2.0 | 0 | 7.616 | true | false | false | false | 2.077839 | 0.258463 | 25.84631 | 0.526114 | 33.526527 | 0.234139 | 23.413897 | 0.318792 | 9.17226 | 0.415917 | 10.05625 | 0.41747 | 35.274453 | false | false | 2024-11-29 | 2024-11-29 | 2 | Qwen/Qwen2.5-7B |
assskelad_smollm2-360M-sft_SmallThoughts_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/assskelad/smollm2-360M-sft_SmallThoughts" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">assskelad/smollm2-360M-sft_SmallThoughts</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/assskelad__smollm2-360M-sft_SmallThoughts-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | assskelad/smollm2-360M-sft_SmallThoughts | 7b0eb51cc4c1d1d2dc234c6553a3c2859b6207e5 | 5.042467 | 1 | 0.362 | false | false | false | true | 0.416623 | 0.200711 | 20.071078 | 0.314957 | 4.164357 | 0.016616 | 1.661631 | 0.259228 | 1.230425 | 0.339521 | 1.106771 | 0.118185 | 2.020538 | false | false | 2025-03-10 | 2025-03-12 | 1 | HuggingFaceTB/SmolLM2-360M |
athirdpath_Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/athirdpath__Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1-plus_reddit | 42eaee4de10302fec7c0c20ad96f527cfb0b10a3 | 20.968535 | apache-2.0 | 2 | 8.03 | true | false | false | false | 1.773055 | 0.452104 | 45.210375 | 0.493907 | 28.015909 | 0.101964 | 10.196375 | 0.291946 | 5.592841 | 0.386396 | 8.299479 | 0.356466 | 28.496232 | false | false | 2024-07-30 | 2024-08-01 | 1 | athirdpath/Llama-3.1-Instruct_NSFW-pretrained_e1 |
automerger_YamshadowExperiment28-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/automerger/YamshadowExperiment28-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">automerger/YamshadowExperiment28-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/automerger__YamshadowExperiment28-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | automerger/YamshadowExperiment28-7B | 76972ed8aacba1fd14f78e6f8d347f087f8b6800 | 19.88427 | apache-2.0 | 24 | 7.242 | true | false | false | false | 1.168848 | 0.407016 | 40.701561 | 0.515003 | 31.980235 | 0.061178 | 6.117825 | 0.286913 | 4.9217 | 0.430615 | 12.69349 | 0.306017 | 22.89081 | true | false | 2024-03-18 | 2024-06-29 | 1 | automerger/YamshadowExperiment28-7B (Merge) |
avemio_GRAG-NEMO-12B-ORPO-HESSIAN-AI_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/avemio/GRAG-NEMO-12B-ORPO-HESSIAN-AI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">avemio/GRAG-NEMO-12B-ORPO-HESSIAN-AI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/avemio__GRAG-NEMO-12B-ORPO-HESSIAN-AI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | avemio/GRAG-NEMO-12B-ORPO-HESSIAN-AI | 0049b137a19fcc964d2e4864b851f24a753e401c | 0.737851 | apache-2.0 | 0 | 12.248 | true | false | false | true | 1.962405 | 0 | 0 | 0.26066 | 0.441064 | 0 | 0 | 0.259228 | 1.230425 | 0.344667 | 2.083333 | 0.106051 | 0.672281 | false | false | 2024-12-04 | 2025-01-15 | 1 | avemio/GRAG-NEMO-12B-ORPO-HESSIAN-AI (Merge) |
awnr_Mistral-7B-v0.1-signtensors-1-over-2_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-1-over-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">awnr/Mistral-7B-v0.1-signtensors-1-over-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-1-over-2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | awnr/Mistral-7B-v0.1-signtensors-1-over-2 | 9575327242f8539eac59b6d788beccf54a6f9414 | 14.370487 | apache-2.0 | 2 | 7.242 | true | false | false | false | 3.213883 | 0.217922 | 21.792178 | 0.442288 | 22.400153 | 0.033988 | 3.398792 | 0.307047 | 7.606264 | 0.400604 | 8.808854 | 0.29995 | 22.216681 | false | false | 2024-06-27 | 2024-07-30 | 0 | awnr/Mistral-7B-v0.1-signtensors-1-over-2 |
awnr_Mistral-7B-v0.1-signtensors-1-over-4_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-1-over-4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">awnr/Mistral-7B-v0.1-signtensors-1-over-4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-1-over-4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | awnr/Mistral-7B-v0.1-signtensors-1-over-4 | b288ab9d8adfd2963a44a7935bb47649f55bcbee | 8.747198 | apache-2.0 | 1 | 7 | true | false | false | false | 1.932072 | 0.213301 | 21.330071 | 0.350709 | 9.227694 | 0.024924 | 2.492447 | 0.270134 | 2.684564 | 0.346031 | 2.18724 | 0.231051 | 14.56117 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-1-over-4 |
awnr_Mistral-7B-v0.1-signtensors-3-over-8_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-3-over-8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">awnr/Mistral-7B-v0.1-signtensors-3-over-8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-3-over-8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | awnr/Mistral-7B-v0.1-signtensors-3-over-8 | fa368f705ace05da2fef25c030fe740cf1fef176 | 13.813469 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.905537 | 0.239429 | 23.942916 | 0.429994 | 20.435231 | 0.033233 | 3.323263 | 0.303691 | 7.158837 | 0.38175 | 5.785417 | 0.300116 | 22.235151 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-3-over-8 |
awnr_Mistral-7B-v0.1-signtensors-5-over-16_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-5-over-16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">awnr/Mistral-7B-v0.1-signtensors-5-over-16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-5-over-16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | awnr/Mistral-7B-v0.1-signtensors-5-over-16 | 5ea13b3d0723237889e1512bc70dae72f71884d1 | 12.284529 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.299469 | 0.211827 | 21.182684 | 0.412415 | 17.543031 | 0.029456 | 2.945619 | 0.28104 | 4.138702 | 0.368604 | 6.142188 | 0.295795 | 21.75495 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-5-over-16 |
awnr_Mistral-7B-v0.1-signtensors-7-over-16_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/awnr/Mistral-7B-v0.1-signtensors-7-over-16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">awnr/Mistral-7B-v0.1-signtensors-7-over-16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/awnr__Mistral-7B-v0.1-signtensors-7-over-16-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | awnr/Mistral-7B-v0.1-signtensors-7-over-16 | 0e1f2cb0a81c38fc6c567d9c007883ab62fae266 | 14.246705 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.955622 | 0.229363 | 22.936254 | 0.431582 | 21.040437 | 0.03852 | 3.851964 | 0.303691 | 7.158837 | 0.395208 | 7.934375 | 0.303025 | 22.558363 | false | false | 2024-07-29 | 2024-07-29 | 0 | awnr/Mistral-7B-v0.1-signtensors-7-over-16 |
aws-prototyping_MegaBeam-Mistral-7B-512k_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/aws-prototyping/MegaBeam-Mistral-7B-512k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aws-prototyping/MegaBeam-Mistral-7B-512k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aws-prototyping__MegaBeam-Mistral-7B-512k-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aws-prototyping/MegaBeam-Mistral-7B-512k | 3e3b8c4b933650eed81ede7c4395df943d2a0796 | 17.582482 | apache-2.0 | 50 | 7.242 | true | false | false | true | 1.294376 | 0.597259 | 59.725861 | 0.366234 | 12.361178 | 0.028701 | 2.870091 | 0.282718 | 4.362416 | 0.399365 | 8.520573 | 0.258893 | 17.654772 | false | false | 2024-07-30 | 2024-10-07 | 0 | aws-prototyping/MegaBeam-Mistral-7B-512k |
axolotl-ai-co_romulus-mistral-nemo-12b-simpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/axolotl-ai-co/romulus-mistral-nemo-12b-simpo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">axolotl-ai-co/romulus-mistral-nemo-12b-simpo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/axolotl-ai-co__romulus-mistral-nemo-12b-simpo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | axolotl-ai-co/romulus-mistral-nemo-12b-simpo | 15fd3ffa46c1ea51aa5d26a1da24214e324d7cf2 | 25.176086 | apache-2.0 | 17 | 12.248 | true | false | false | true | 4.080013 | 0.607925 | 60.792475 | 0.539506 | 34.642401 | 0.114048 | 11.404834 | 0.278523 | 3.803132 | 0.423302 | 12.979427 | 0.346908 | 27.434249 | false | false | 2024-07-24 | 2024-09-21 | 1 | Removed |
baconnier_Napoleon_24B_V0.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/baconnier/Napoleon_24B_V0.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">baconnier/Napoleon_24B_V0.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/baconnier__Napoleon_24B_V0.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | baconnier/Napoleon_24B_V0.0 | e05877c4b79fdcb68488b99613a9cf6abfd6f7be | 27.300836 | 0 | 23.572 | false | false | false | false | 1.276656 | 0.180102 | 18.010213 | 0.636711 | 47.264972 | 0.227341 | 22.734139 | 0.379195 | 17.225951 | 0.44199 | 13.682031 | 0.503989 | 44.887707 | false | false | 2025-02-16 | 2025-03-03 | 0 | baconnier/Napoleon_24B_V0.0 |
baconnier_Napoleon_24B_V0.2_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/baconnier/Napoleon_24B_V0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">baconnier/Napoleon_24B_V0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/baconnier__Napoleon_24B_V0.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | baconnier/Napoleon_24B_V0.2 | 0a60bbf6fc74ceabf69416b3c5268801a61a34ff | 24.007932 | 0 | 23.572 | false | false | false | false | 1.288848 | 0.252717 | 25.271723 | 0.591062 | 41.205487 | 0.143505 | 14.350453 | 0.338087 | 11.744966 | 0.445958 | 14.178125 | 0.435672 | 37.296838 | false | false | 2025-02-17 | 2025-02-17 | 1 | baconnier/Napoleon_24B_V0.2 (Merge) |
baebee_7B-Cetacea_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/baebee/7B-Cetacea" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">baebee/7B-Cetacea</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/baebee__7B-Cetacea-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | baebee/7B-Cetacea | fc2f943e55f7edb0d5e165387d69954456f2c4fb | 19.942171 | 0 | 7.242 | false | false | false | true | 0.4542 | 0.527866 | 52.786606 | 0.475717 | 25.75266 | 0.046828 | 4.682779 | 0.286074 | 4.809843 | 0.413625 | 9.903125 | 0.295462 | 21.718011 | false | false | 2025-03-10 | 2025-03-10 | 1 | baebee/7B-Cetacea (Merge) |
baebee_mergekit-model_stock-nzjnheg_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/baebee/mergekit-model_stock-nzjnheg" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">baebee/mergekit-model_stock-nzjnheg</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/baebee__mergekit-model_stock-nzjnheg-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | baebee/mergekit-model_stock-nzjnheg | c6fd83921bfb927e6d49fb66d0ca524fddccba01 | 22.997985 | 0 | 7.616 | false | false | false | true | 0.633483 | 0.484427 | 48.442688 | 0.528739 | 32.742095 | 0.167674 | 16.767372 | 0.280201 | 4.026846 | 0.384667 | 6.016667 | 0.36993 | 29.992243 | false | false | 2025-02-26 | 2025-02-27 | 1 | baebee/mergekit-model_stock-nzjnheg (Merge) |
baebee_mergekit-ties-fnjenli_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/baebee/mergekit-ties-fnjenli" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">baebee/mergekit-ties-fnjenli</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/baebee__mergekit-ties-fnjenli-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | baebee/mergekit-ties-fnjenli | d4310b29dab10c750e141f9e440d55e0e3b3814a | 5.467473 | 1 | 7.616 | false | false | false | false | 0.6872 | 0.198812 | 19.881248 | 0.30237 | 2.925295 | 0.002266 | 0.226586 | 0.244966 | 0 | 0.401938 | 8.342188 | 0.112866 | 1.429521 | false | false | 2025-02-24 | 2025-02-24 | 1 | baebee/mergekit-ties-fnjenli (Merge) |
bamec66557_MISCHIEVOUS-12B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B | da223cb99a6bc23c2ef124fee2ed2581cd5a880a | 22.619278 | apache-2.0 | 3 | 12.248 | true | false | false | false | 2.230188 | 0.385184 | 38.518354 | 0.540498 | 34.071627 | 0.127644 | 12.76435 | 0.32047 | 9.395973 | 0.41449 | 11.277865 | 0.367188 | 29.6875 | true | false | 2024-12-13 | 2024-12-18 | 1 | bamec66557/MISCHIEVOUS-12B (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_0.1v_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_0.1v" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_0.1v</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_0.1v-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_0.1v | a7d01046742cbccace2ee5343b434313fa47e5e1 | 22.611076 | apache-2.0 | 2 | 12.248 | true | false | false | false | 2.262394 | 0.363626 | 36.362629 | 0.543602 | 34.357592 | 0.132931 | 13.293051 | 0.32802 | 10.402685 | 0.413156 | 11.544531 | 0.367354 | 29.705969 | true | false | 2024-12-14 | 2024-12-18 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_0.1v (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_0.2v_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_0.2v" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_0.2v</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_0.2v-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_0.2v | f93c9997bc22abb5c97a895083c20b84e957d172 | 22.454257 | apache-2.0 | 1 | 12.248 | true | false | false | false | 2.265824 | 0.362377 | 36.237738 | 0.543436 | 34.410277 | 0.126133 | 12.613293 | 0.325503 | 10.067114 | 0.415823 | 11.811198 | 0.366273 | 29.585919 | true | false | 2024-12-14 | 2024-12-18 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_0.2v (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_0.3v_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_0.3v" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_0.3v</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_0.3v-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_0.3v | e4e3ffddc0ff26b9ee915f28dacd0bc298333ed2 | 22.795688 | apache-2.0 | 1 | 12.248 | true | false | false | false | 2.259516 | 0.386982 | 38.69821 | 0.543139 | 34.387443 | 0.133686 | 13.36858 | 0.319631 | 9.284116 | 0.413125 | 11.440625 | 0.366356 | 29.595154 | true | false | 2024-12-15 | 2024-12-18 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_0.3v (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_0.4v_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_0.4v" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_0.4v</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_0.4v-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_0.4v | fd2f202fd2114aed91fe7009588ccdc3eefc40a2 | 26.653906 | apache-2.0 | 2 | 12.248 | true | false | false | true | 2.108187 | 0.650814 | 65.081428 | 0.509424 | 30.596485 | 0.135196 | 13.519637 | 0.317114 | 8.948546 | 0.417625 | 11.969792 | 0.368268 | 29.80755 | true | false | 2024-12-16 | 2024-12-16 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_0.4v (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_0.5v_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_0.5v" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_0.5v</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_0.5v-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_0.5v | e9284d6aa1868026ef1687160858b94f16e164c1 | 22.635397 | apache-2.0 | 1 | 12.248 | true | false | false | false | 2.308956 | 0.374567 | 37.456726 | 0.542193 | 34.177004 | 0.136707 | 13.670695 | 0.32047 | 9.395973 | 0.413156 | 11.544531 | 0.366107 | 29.56745 | true | false | 2024-12-16 | 2024-12-16 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_0.5v (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_0.6v_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_0.6v" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_0.6v</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_0.6v-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_0.6v | b089a77c506215a02abb495c912526b7dfd0bc8e | 23.874345 | apache-2.0 | 1 | 12.248 | true | false | false | false | 2.130705 | 0.436566 | 43.656609 | 0.544891 | 34.727794 | 0.125378 | 12.537764 | 0.32802 | 10.402685 | 0.41849 | 12.344531 | 0.36619 | 29.576684 | true | false | 2024-12-19 | 2024-12-19 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_0.6v (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_III_IV_V_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_III_IV_V" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_III_IV_V</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_III_IV_V-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_III_IV_V | 651323f625a3b0f982bd0ca294b7293327de19ad | 23.23673 | mit | 1 | 12.248 | true | false | false | false | 2.188831 | 0.403094 | 40.309379 | 0.546453 | 34.850701 | 0.129154 | 12.915408 | 0.32047 | 9.395973 | 0.419823 | 12.344531 | 0.366439 | 29.604388 | true | false | 2024-12-19 | 2024-12-19 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_III_IV_V (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_III_ex_V_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_III_ex_V" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_III_ex_V</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_III_ex_V-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_III_ex_V | 3af16a4f7be269c0483ba7ff3c7ea70c5843a44d | 23.807842 | apache-2.0 | 0 | 12.248 | true | false | false | false | 2.159005 | 0.43162 | 43.162032 | 0.544893 | 34.868635 | 0.132175 | 13.217523 | 0.32047 | 9.395973 | 0.419792 | 12.773958 | 0.36486 | 29.42893 | true | false | 2024-12-17 | 2024-12-18 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_III_ex_V (Merge) |
bamec66557_MISCHIEVOUS-12B-Mix_Neo_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/MISCHIEVOUS-12B-Mix_Neo" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/MISCHIEVOUS-12B-Mix_Neo</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__MISCHIEVOUS-12B-Mix_Neo-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/MISCHIEVOUS-12B-Mix_Neo | a3f902a40e0e6a1b7abdefe70e8cd14929deddc9 | 26.07767 | apache-2.0 | 1 | 12.248 | true | false | false | true | 2.0109 | 0.624961 | 62.496066 | 0.507757 | 30.36069 | 0.132931 | 13.293051 | 0.316275 | 8.836689 | 0.415021 | 11.644271 | 0.368517 | 29.835254 | true | false | 2024-12-19 | 2024-12-19 | 1 | bamec66557/MISCHIEVOUS-12B-Mix_Neo (Merge) |
bamec66557_Mistral-Nemo-VICIOUS_MESH-12B-2407_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/Mistral-Nemo-VICIOUS_MESH-12B-2407" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/Mistral-Nemo-VICIOUS_MESH-12B-2407</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__Mistral-Nemo-VICIOUS_MESH-12B-2407-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/Mistral-Nemo-VICIOUS_MESH-12B-2407 | 5e11cfae129186b2479fb01211d565a16eed1f02 | 27.482117 | apache-2.0 | 2 | 12.248 | true | false | false | true | 2.938804 | 0.670573 | 67.057297 | 0.515596 | 31.356607 | 0.136707 | 13.670695 | 0.315436 | 8.724832 | 0.43099 | 14.340365 | 0.367686 | 29.742908 | true | false | 2024-12-22 | 2024-12-22 | 1 | bamec66557/Mistral-Nemo-VICIOUS_MESH-12B-2407 (Merge) |
bamec66557_NameLess-12B-prob_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/bamec66557/NameLess-12B-prob" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">bamec66557/NameLess-12B-prob</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/bamec66557__NameLess-12B-prob-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | bamec66557/NameLess-12B-prob | eb53d01a5d573356c16b6235679e84567f599e33 | 27.189049 | apache-2.0 | 0 | 12.248 | true | false | false | true | 1.922933 | 0.660232 | 66.023152 | 0.515814 | 31.355728 | 0.126133 | 12.613293 | 0.314597 | 8.612975 | 0.433625 | 14.703125 | 0.368434 | 29.82602 | true | false | 2024-12-25 | 2024-12-25 | 0 | bamec66557/NameLess-12B-prob |