eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ahmeda335_13_outOf_32_pruned_layers_llama3.1-8b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ahmeda335__13_outOf_32_pruned_layers_llama3.1-8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b | 248c420cc0a0bb8fce3a64a998ca0ce89613783c | 4.404259 | apache-2.0 | 0 | 5.195 | true | false | false | true | 0.992047 | 0.174807 | 17.480729 | 0.288326 | 1.677845 | 0 | 0 | 0.259228 | 1.230425 | 0.380323 | 4.607031 | 0.112866 | 1.429521 | false | false | 2024-10-21 | 2024-12-03 | 1 | ahmeda335/13_outOf_32_pruned_layers_llama3.1-8b (Merge) |
ai21labs_Jamba-v0.1_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | JambaForCausalLM | <a target="_blank" href="https://huggingface.co/ai21labs/Jamba-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ai21labs/Jamba-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ai21labs__Jamba-v0.1-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ai21labs/Jamba-v0.1 | ce13f3fe99555a2606d1892665bb67649032ff2d | 9.218365 | apache-2.0 | 1,181 | 51.57 | true | true | false | true | 15.315295 | 0.202559 | 20.255921 | 0.360226 | 10.722059 | 0.015861 | 1.586103 | 0.268456 | 2.46085 | 0.359021 | 3.710937 | 0.249169 | 16.57432 | false | true | 2024-03-28 | 2024-09-16 | 0 | ai21labs/Jamba-v0.1 |
ai4bharat_Airavata_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/ai4bharat/Airavata" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ai4bharat/Airavata</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/ai4bharat__Airavata-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | ai4bharat/Airavata | 3af92e691e461d80d080823f48996df10aa8ec19 | 5.550973 | llama2 | 34 | 6.87 | true | false | false | false | 1.070554 | 0.055854 | 5.585402 | 0.362769 | 11.574029 | 0.018127 | 1.812689 | 0.274329 | 3.243848 | 0.376292 | 4.036458 | 0.163481 | 7.053413 | false | false | 2024-01-13 | 2025-02-06 | 0 | ai4bharat/Airavata |
aixonlab_Aether-12b_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/aixonlab/Aether-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aixonlab/Aether-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aixonlab__Aether-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aixonlab/Aether-12b | c55d08a69c74f87c18ab5afb05d46359f389c91a | 18.045943 | apache-2.0 | 1 | 12.248 | true | false | false | false | 3.732864 | 0.234683 | 23.468286 | 0.51794 | 30.551138 | 0.106495 | 10.649547 | 0.316275 | 8.836689 | 0.382865 | 7.991406 | 0.341007 | 26.77859 | false | false | 2024-09-24 | 2024-10-09 | 1 | Xclbr7/Arcanum-12b |
aixonlab_Grey-12b_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/aixonlab/Grey-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aixonlab/Grey-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aixonlab__Grey-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aixonlab/Grey-12b | 50f56572870c49186c3679f9949a602d2d97c046 | 23.681553 | apache-2.0 | 1 | 12.248 | true | false | false | false | 2.937387 | 0.396799 | 39.679938 | 0.569896 | 38.746043 | 0.098187 | 9.818731 | 0.300336 | 6.711409 | 0.451635 | 16.254427 | 0.377909 | 30.878768 | false | false | 2024-10-07 | 2024-10-09 | 2 | Xclbr7/Arcanum-12b |
aixonlab_Zara-14b-v1.2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/aixonlab/Zara-14b-v1.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">aixonlab/Zara-14b-v1.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/aixonlab__Zara-14b-v1.2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | aixonlab/Zara-14b-v1.2 | 88794cba01ae33c28cd4e698d7663c3e80f3c0ae | 37.998913 | apache-2.0 | 3 | 14.766 | true | false | false | false | 1.940744 | 0.61974 | 61.974007 | 0.640537 | 48.270463 | 0.353474 | 35.347432 | 0.381711 | 17.561521 | 0.467479 | 17.468229 | 0.526346 | 47.371823 | false | false | 2025-01-29 | 2025-02-24 | 3 | sometimesanotion/Lamarck-14B-v0.7 (Merge) |
akhadangi_Llama3.2.1B.0.01-First_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.01-First" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.01-First</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.01-First-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | akhadangi/Llama3.2.1B.0.01-First | b9d1edeb95f15c92118f5b4677c2f97ca5523a3d | 3.109892 | llama3.2 | 0 | 1.236 | true | false | false | false | 0.370919 | 0.081359 | 8.135857 | 0.318919 | 4.766231 | 0.018127 | 1.812689 | 0.248322 | 0 | 0.319396 | 1.757812 | 0.119681 | 2.186761 | false | false | 2025-03-10 | 2025-03-10 | 1 | akhadangi/Llama3.2.1B.0.01-First (Merge) |
akhadangi_Llama3.2.1B.0.01-Last_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.01-Last" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.01-Last</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.01-Last-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | akhadangi/Llama3.2.1B.0.01-Last | 30ff8e568c712c27f8c4990a988115817ef604fb | 3.26213 | llama3.2 | 0 | 1.236 | true | false | false | false | 0.367426 | 0.09165 | 9.165015 | 0.315928 | 4.282945 | 0.013595 | 1.359517 | 0.243289 | 0 | 0.320635 | 2.246094 | 0.122673 | 2.519208 | false | false | 2025-03-10 | 2025-03-10 | 1 | akhadangi/Llama3.2.1B.0.01-Last (Merge) |
akhadangi_Llama3.2.1B.0.1-First_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.1-First" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.1-First</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.1-First-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | akhadangi/Llama3.2.1B.0.1-First | 0b986d5ce44bb9c0f3f3fd31442353d590b37700 | 3.269241 | llama3.2 | 0 | 1.236 | true | false | false | false | 0.367464 | 0.100093 | 10.009331 | 0.311962 | 4.177 | 0.021148 | 2.114804 | 0.244966 | 0 | 0.330125 | 1.432292 | 0.116938 | 1.882018 | false | false | 2025-03-10 | 2025-03-10 | 1 | akhadangi/Llama3.2.1B.0.1-First (Merge) |
akhadangi_Llama3.2.1B.0.1-Last_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.0.1-Last" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.0.1-Last</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.0.1-Last-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | akhadangi/Llama3.2.1B.0.1-Last | 72a0652d9432dc62a296437a7880c0a0cb267097 | 3.266722 | llama3.2 | 0 | 1.236 | true | false | false | false | 0.372928 | 0.094972 | 9.497245 | 0.316378 | 4.256106 | 0.021148 | 2.114804 | 0.238255 | 0 | 0.334063 | 1.757812 | 0.117769 | 1.974365 | false | false | 2025-03-10 | 2025-03-10 | 1 | akhadangi/Llama3.2.1B.0.1-Last (Merge) |
akhadangi_Llama3.2.1B.BaseFiT_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/akhadangi/Llama3.2.1B.BaseFiT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akhadangi/Llama3.2.1B.BaseFiT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akhadangi__Llama3.2.1B.BaseFiT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | akhadangi/Llama3.2.1B.BaseFiT | 7e62ad1fd646caf483cfc39d83ab50bb295b8060 | 3.318003 | llama3.2 | 0 | 1.236 | true | false | false | false | 0.372435 | 0.088278 | 8.827799 | 0.317452 | 4.548336 | 0.024169 | 2.416918 | 0.253356 | 0.447427 | 0.322063 | 1.757812 | 0.117188 | 1.909722 | false | false | 2025-03-10 | 2025-03-10 | 1 | akhadangi/Llama3.2.1B.BaseFiT (Merge) |
akjindal53244_Llama-3.1-Storm-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akjindal53244/Llama-3.1-Storm-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akjindal53244__Llama-3.1-Storm-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | akjindal53244/Llama-3.1-Storm-8B | df21b06dcf534b026dd301a44a521d7253c8b94b | 29.36525 | llama3.1 | 175 | 8.03 | true | false | false | true | 0.794391 | 0.803263 | 80.326312 | 0.519633 | 31.615695 | 0.162387 | 16.238671 | 0.309564 | 7.941834 | 0.402833 | 8.820833 | 0.381233 | 31.248153 | true | false | 2024-08-12 | 2024-10-27 | 0 | akjindal53244/Llama-3.1-Storm-8B |
akjindal53244_Llama-3.1-Storm-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/akjindal53244/Llama-3.1-Storm-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">akjindal53244/Llama-3.1-Storm-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/akjindal53244__Llama-3.1-Storm-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | akjindal53244/Llama-3.1-Storm-8B | df21b06dcf534b026dd301a44a521d7253c8b94b | 29.943924 | llama3.1 | 175 | 8.03 | true | false | false | true | 1.941915 | 0.805062 | 80.506168 | 0.518867 | 31.494363 | 0.172205 | 17.220544 | 0.326342 | 10.178971 | 0.402802 | 9.116927 | 0.380319 | 31.146572 | true | false | 2024-08-12 | 2024-11-26 | 0 | akjindal53244/Llama-3.1-Storm-8B |
alcholjung_llama3_medical_tuned_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | <a target="_blank" href="https://huggingface.co/alcholjung/llama3_medical_tuned" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">alcholjung/llama3_medical_tuned</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/alcholjung__llama3_medical_tuned-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | alcholjung/llama3_medical_tuned | 62bd457b6fe961a42a631306577e622c83876cb6 | 12.04877 | | 0 | 16.061 | false | false | false | false | 1.821442 | 0.010566 | 1.056641 | 0.451294 | 23.265089 | 0.046828 | 4.682779 | 0.286074 | 4.809843 | 0.466021 | 16.852604 | 0.294631 | 21.625665 | false | false | 2024-08-14 | 2024-08-14 | 0 | alcholjung/llama3_medical_tuned |
allenai_Llama-3.1-Tulu-3-70B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-70B | c4280450c0cd91a2fb6f41a25c6a1662c6966b01 | 42.331787 | llama3.1 | 54 | 70.554 | true | false | false | true | 73.186101 | 0.829117 | 82.911674 | 0.616363 | 45.365569 | 0.450151 | 45.015106 | 0.373322 | 16.442953 | 0.494833 | 23.754167 | 0.464511 | 40.501256 | false | true | 2024-11-20 | 2024-11-27 | 1 | allenai/Llama-3.1-Tulu-3-70B (Merge) |
allenai_Llama-3.1-Tulu-3-70B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-70B | c4280450c0cd91a2fb6f41a25c6a1662c6966b01 | 41.454527 | llama3.1 | 54 | 70.554 | true | false | false | true | 38.022026 | 0.837934 | 83.793446 | 0.615685 | 45.259481 | 0.382931 | 38.293051 | 0.373322 | 16.442953 | 0.498802 | 24.316927 | 0.465592 | 40.621306 | false | true | 2024-11-20 | 2024-11-27 | 1 | allenai/Llama-3.1-Tulu-3-70B (Merge) |
allenai_Llama-3.1-Tulu-3-70B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-70B-DPO | 6ea110f39fb660573111892a1381d3be3f826f80 | 42.224415 | llama3.1 | 9 | 70 | true | false | false | true | 73.601498 | 0.828193 | 82.819253 | 0.61462 | 45.047181 | 0.449396 | 44.939577 | 0.375839 | 16.778523 | 0.49226 | 23.399219 | 0.463265 | 40.362736 | false | true | 2024-11-20 | 2024-11-27 | 1 | allenai/Llama-3.1-Tulu-3-70B-DPO (Merge) |
allenai_Llama-3.1-Tulu-3-70B-SFT_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-70B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-70B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-70B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-70B-SFT | f58ab66db3a1c5dd805c6d3420b2b4f5aef30041 | 38.848492 | llama3.1 | 6 | 70.554 | true | false | false | true | 54.676654 | 0.805062 | 80.506168 | 0.595144 | 42.023984 | 0.331571 | 33.1571 | 0.344799 | 12.639821 | 0.502615 | 24.49349 | 0.462434 | 40.27039 | false | true | 2024-11-18 | 2024-11-27 | 1 | allenai/Llama-3.1-Tulu-3-70B-SFT (Merge) |
allenai_Llama-3.1-Tulu-3-8B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-8B | 63b75e0dd6eac3725319f869716b9b70c16a6a65 | 26.034998 | llama3.1 | 161 | 8.03 | true | false | false | true | 0.703774 | 0.826669 | 82.666879 | 0.404983 | 16.671813 | 0.196375 | 19.637462 | 0.298658 | 6.487696 | 0.417469 | 10.45026 | 0.282663 | 20.295878 | false | true | 2024-11-20 | 2024-11-21 | 1 | allenai/Llama-3.1-Tulu-3-8B (Merge) |
allenai_Llama-3.1-Tulu-3-8B_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-8B | 50fef8756a9a4ca2010587d128aebb3a18ec897d | 26.260868 | llama3.1 | 161 | 8.03 | true | false | false | true | 1.402464 | 0.82547 | 82.546975 | 0.406083 | 16.858052 | 0.21148 | 21.148036 | 0.29698 | 6.263982 | 0.417469 | 10.516927 | 0.282081 | 20.231235 | false | true | 2024-11-20 | 2024-11-28 | 1 | allenai/Llama-3.1-Tulu-3-8B (Merge) |
allenai_Llama-3.1-Tulu-3-8B-DPO_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-DPO" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B-DPO</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-DPO-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-8B-DPO | 002347006131d85678ea3865520bc9caad69869a | 26.46398 | llama3.1 | 24 | 8 | true | false | false | true | 1.340675 | 0.802938 | 80.293843 | 0.407943 | 17.426016 | 0.236405 | 23.640483 | 0.293624 | 5.816555 | 0.416135 | 10.516927 | 0.289811 | 21.090056 | false | true | 2024-11-20 | 2024-11-22 | 1 | allenai/Llama-3.1-Tulu-3-8B-DPO (Merge) |
allenai_Llama-3.1-Tulu-3-8B-RM_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForSequenceClassification | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-RM" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B-RM</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-RM-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-8B-RM | 76247c00745747f820f1712949b5b37901d0f9c4 | 4.235057 | llama3.1 | 17 | 8 | true | false | false | true | 1.473798 | 0.167014 | 16.701352 | 0.295004 | 2.64967 | 0 | 0 | 0.256711 | 0.894855 | 0.376417 | 4.252083 | 0.108211 | 0.912382 | false | true | 2024-11-20 | 2024-11-22 | 1 | allenai/Llama-3.1-Tulu-3-8B-RM (Merge) |
allenai_Llama-3.1-Tulu-3-8B-SFT_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/Llama-3.1-Tulu-3-8B-SFT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/Llama-3.1-Tulu-3-8B-SFT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__Llama-3.1-Tulu-3-8B-SFT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/Llama-3.1-Tulu-3-8B-SFT | 4ddd761e6750e04ea3d468175f78463628bba860 | 22.596941 | llama3.1 | 29 | 8.03 | true | false | false | true | 1.366493 | 0.74034 | 74.034008 | 0.387186 | 13.931208 | 0.117825 | 11.782477 | 0.277685 | 3.691275 | 0.426771 | 12.013021 | 0.281167 | 20.129654 | false | true | 2024-11-18 | 2024-11-22 | 1 | allenai/Llama-3.1-Tulu-3-8B-SFT (Merge) |
allenai_OLMo-1.7-7B-hf_float16 | float16 | ❓ other | ❓ | Original | Unknown | <a target="_blank" href="https://huggingface.co/allenai/OLMo-1.7-7B-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-1.7-7B-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-1.7-7B-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-1.7-7B-hf | a2a514275cb69a5f9b3dd51e0a4e92df88a12dfb | 3.800232 | apache-2.0 | 12 | 0 | true | false | false | false | 0.654293 | 0.156897 | 15.689703 | 0.30137 | 2.770316 | 0.002266 | 0.226586 | 0.255034 | 0.671141 | 0.34749 | 2.069531 | 0.112367 | 1.374113 | false | true | 2024-04-17 | | 0 | allenai/OLMo-1.7-7B-hf |
allenai_OLMo-1B-hf_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OlmoForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMo-1B-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-1B-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-1B-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-1B-hf | 8e995430edd24416ccfa98b5b283fa07b0c9f1a9 | 6.633924 | apache-2.0 | 20 | 1.177 | true | false | false | false | 0.497747 | 0.218197 | 21.819661 | 0.305195 | 3.196546 | 0.017372 | 1.73716 | 0.261745 | 1.565996 | 0.409781 | 9.55599 | 0.117354 | 1.928191 | false | true | 2024-04-12 | 2024-06-12 | 0 | allenai/OLMo-1B-hf |
allenai_OLMo-2-1124-7B-Instruct_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | Olmo2ForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMo-2-1124-7B-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-2-1124-7B-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-2-1124-7B-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-2-1124-7B-Instruct | 470b1fba1ae01581f270116362ee4aa1b97f4c84 | 21.785857 | apache-2.0 | 30 | 7.299 | true | false | false | true | 1.63589 | 0.724403 | 72.440347 | 0.402236 | 16.326773 | 0.148792 | 14.879154 | 0.278523 | 3.803132 | 0.350833 | 4.6875 | 0.267204 | 18.578236 | false | true | 2024-12-18 | 2025-01-07 | 1 | allenai/OLMo-2-1124-7B-Instruct (Merge) |
allenai_OLMo-7B-Instruct-hf_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | OlmoForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMo-7B-Instruct-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-7B-Instruct-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-7B-Instruct-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-7B-Instruct-hf | 2ea947518df93433aa71219f29b36c72ac63be95 | 10.848973 | apache-2.0 | 3 | 7 | true | false | false | true | 1.790599 | 0.347265 | 34.726526 | 0.370647 | 13.159933 | 0.013595 | 1.359517 | 0.270973 | 2.796421 | 0.376479 | 4.326563 | 0.178524 | 8.724882 | false | true | 2024-06-04 | 2024-06-27 | 0 | allenai/OLMo-7B-Instruct-hf |
allenai_OLMo-7B-hf_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OlmoForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMo-7B-hf" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMo-7B-hf</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMo-7B-hf-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMo-7B-hf | 687d934d36a05417048d0fe7482f24f389fef6aa | 6.864268 | apache-2.0 | 13 | 6.888 | true | false | false | false | 1.181128 | 0.271927 | 27.192737 | 0.327913 | 5.761987 | 0.012085 | 1.208459 | 0.272651 | 3.020134 | 0.348667 | 2.083333 | 0.117271 | 1.918957 | false | true | 2024-04-12 | 2024-06-27 | 0 | allenai/OLMo-7B-hf |
allenai_OLMoE-1B-7B-0125-Instruct_float16 | float16 | 🟢 pretrained | 🟢 | Original | OlmoeForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMoE-1B-7B-0125-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMoE-1B-7B-0125-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMoE-1B-7B-0125-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMoE-1B-7B-0125-Instruct | b89a7c4bc24fb9e55ce2543c9458ce0ca5c4650e | 17.509876 | apache-2.0 | 43 | 6.919 | true | true | false | true | 2.608349 | 0.675744 | 67.574369 | 0.382453 | 14.007956 | 0.089879 | 8.987915 | 0.260067 | 1.342282 | 0.363583 | 2.98125 | 0.191489 | 10.165485 | false | true | 2025-01-27 | 2025-02-18 | 1 | allenai/OLMoE-1B-7B-0125-Instruct (Merge) |
allenai_OLMoE-1B-7B-0924_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | OlmoeForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMoE-1B-7B-0924" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMoE-1B-7B-0924</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMoE-1B-7B-0924-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMoE-1B-7B-0924 | 4fa3a6e09ed0e41639962f38bfba0fc532b90075 | 7.266581 | apache-2.0 | 112 | 6.919 | true | true | false | false | 6.152814 | 0.218471 | 21.847143 | 0.339344 | 8.308107 | 0.016616 | 1.661631 | 0.247483 | 0 | 0.348792 | 3.565625 | 0.173953 | 8.216977 | false | true | 2024-07-20 | 2024-09-30 | 0 | allenai/OLMoE-1B-7B-0924 |
allenai_OLMoE-1B-7B-0924-Instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | OlmoeForCausalLM | <a target="_blank" href="https://huggingface.co/allenai/OLMoE-1B-7B-0924-Instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allenai/OLMoE-1B-7B-0924-Instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allenai__OLMoE-1B-7B-0924-Instruct-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allenai/OLMoE-1B-7B-0924-Instruct | 7f1c97f440f06ce36705e4f2b843edb5925f4498 | 13.698377 | apache-2.0 | 88 | 6.919 | true | true | false | true | 8.245768 | 0.466742 | 46.674158 | 0.390161 | 14.571563 | 0.027946 | 2.794562 | 0.267617 | 2.348993 | 0.384823 | 6.069531 | 0.187583 | 9.731457 | false | true | 2024-08-13 | 2024-09-30 | 2 | allenai/OLMoE-1B-7B-0924 |
allknowingroger_Chocolatine-24B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Chocolatine-24B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Chocolatine-24B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Chocolatine-24B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Chocolatine-24B | 6245b82885ca4930575dbed2932ec1d32d901c0e | 21.345734 | apache-2.0 | 2 | 24.184 | true | false | false | false | 12.36992 | 0.195815 | 19.581488 | 0.619126 | 45.78594 | 0.000755 | 0.075529 | 0.325503 | 10.067114 | 0.432323 | 12.940365 | 0.456616 | 39.623966 | true | false | 2024-09-02 | 2024-09-02 | 1 | allknowingroger/Chocolatine-24B (Merge) |
allknowingroger_Gemma2Slerp1-2.6B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp1-2.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp1-2.6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp1-2.6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp1-2.6B | 2d0e85a03c55abd22963c5c3a44f180bfecebf7b | 20.670221 | | 0 | 2.614 | false | false | false | false | 2.388527 | 0.535435 | 53.543487 | 0.434309 | 19.770255 | 0.106495 | 10.649547 | 0.283557 | 4.474273 | 0.456167 | 16.820833 | 0.268866 | 18.762928 | false | false | 2024-12-04 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp1-2.6B (Merge) |
allknowingroger_Gemma2Slerp1-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp1-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp1-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp1-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp1-27B | 4a5c5092f40cc161bb18ca2b9e30a653c768e062 | 36.507639 | apache-2.0 | 0 | 27.227 | true | false | false | false | 8.179935 | 0.718633 | 71.863323 | 0.63989 | 48.377666 | 0.258308 | 25.830816 | 0.364094 | 15.212528 | 0.476719 | 19.35651 | 0.445645 | 38.404994 | true | false | 2024-11-30 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp1-27B (Merge) |
allknowingroger_Gemma2Slerp2-2.6B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp2-2.6B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp2-2.6B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp2-2.6B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp2-2.6B | 12ca2fdb5dd866fbdc624057a176ad3d1f8c2293 | 21.294048 | | 2 | 2.614 | false | false | false | false | 2.400023 | 0.574727 | 57.472728 | 0.430765 | 19.719839 | 0.090634 | 9.063444 | 0.305369 | 7.38255 | 0.446771 | 15.279688 | 0.269614 | 18.84604 | false | false | 2024-12-04 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp2-2.6B (Merge) |
allknowingroger_Gemma2Slerp2-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp2-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp2-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp2-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp2-27B | 21043f6eaf40680675461825fbdfc964f4a3c4a0 | 37.9317 | apache-2.0 | 1 | 27.227 | true | false | false | false | 8.823042 | 0.754553 | 75.455347 | 0.655727 | 51.090234 | 0.278701 | 27.870091 | 0.369966 | 15.995526 | 0.462083 | 16.927083 | 0.462267 | 40.251921 | true | false | 2024-11-30 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp2-27B (Merge) |
allknowingroger_Gemma2Slerp3-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp3-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp3-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp3-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp3-27B | cddd53f3b29a361be2350b76770a60b3fcc78059 | 37.531456 | apache-2.0 | 0 | 27.227 | true | false | false | false | 8.693278 | 0.742638 | 74.263842 | 0.649964 | 49.951521 | 0.274169 | 27.416918 | 0.354866 | 13.982103 | 0.474021 | 19.119271 | 0.464096 | 40.455083 | true | false | 2024-12-01 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp3-27B (Merge) |
allknowingroger_Gemma2Slerp4-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Gemma2Slerp4-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Gemma2Slerp4-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Gemma2Slerp4-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Gemma2Slerp4-27B | 5c89bb96e60f0297f5bf27fc10713a4dcdd54285 | 37.367554 | apache-2.0 | 0 | 27.227 | true | false | false | false | 8.813367 | 0.749658 | 74.965758 | 0.652958 | 50.773762 | 0.271903 | 27.190332 | 0.366611 | 15.548098 | 0.45024 | 15.179948 | 0.464927 | 40.547429 | true | false | 2024-12-01 | 2024-12-06 | 1 | allknowingroger/Gemma2Slerp4-27B (Merge) |
allknowingroger_GemmaSlerp-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp-9B | 4f54819ae9c0af1f3e508f0afc88a7a734f9632d | 33.186118 | apache-2.0 | 0 | 9.242 | true | false | false | false | 3.21534 | 0.70432 | 70.432009 | 0.592058 | 41.556032 | 0.216012 | 21.601208 | 0.34396 | 12.527964 | 0.467323 | 17.882031 | 0.416057 | 35.117465 | true | false | 2024-10-27 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp-9B (Merge) |
allknowingroger_GemmaSlerp2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp2-9B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp2-9B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp2-9B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp2-9B | e93fb8d7fad0007e463e44365a5a82d0d6facd61 | 34.257578 | apache-2.0 | 3 | 9.242 | true | false | false | false | 3.570007 | 0.7281 | 72.810033 | 0.598271 | 42.541033 | 0.210725 | 21.072508 | 0.352349 | 13.646532 | 0.476719 | 19.489844 | 0.42387 | 35.98552 | true | false | 2024-10-29 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp2-9B (Merge) |
allknowingroger_GemmaSlerp4-10B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp4-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp4-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp4-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp4-10B | e30d14d05730a83926263a7b0e4b1e002b6cd65a | 34.063012 | apache-2.0 | 2 | 10.159 | true | false | false | false | 3.837357 | 0.732622 | 73.262167 | 0.602786 | 43.328658 | 0.22432 | 22.432024 | 0.353188 | 13.758389 | 0.45399 | 15.482031 | 0.425033 | 36.114805 | true | false | 2024-10-30 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp4-10B (Merge) |
allknowingroger_GemmaSlerp5-10B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaSlerp5-10B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaSlerp5-10B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaSlerp5-10B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaSlerp5-10B | 7e94afcde7cc1ae88105521a831abefe8126b0d1 | 34.382403 | apache-2.0 | 2 | 10.159 | true | false | false | false | 4.645722 | 0.735344 | 73.534444 | 0.605448 | 43.538464 | 0.218278 | 21.827795 | 0.352349 | 13.646532 | 0.460781 | 16.764323 | 0.432846 | 36.982861 | true | false | 2024-10-30 | 2024-11-22 | 1 | allknowingroger/GemmaSlerp5-10B (Merge) |
allknowingroger_GemmaStock1-27B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/GemmaStock1-27B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/GemmaStock1-27B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__GemmaStock1-27B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/GemmaStock1-27B | 8563301fe323c4d1060ae6f56d5737ad62a63fef | 37.513659 | apache-2.0 | 0 | 27.227 | true | false | false | false | 8.305481 | 0.750906 | 75.090648 | 0.656561 | 50.990136 | 0.263595 | 26.359517 | 0.364094 | 15.212528 | 0.452687 | 15.985937 | 0.472989 | 41.443189 | true | false | 2024-12-03 | 2024-12-06 | 1 | allknowingroger/GemmaStock1-27B (Merge) |
allknowingroger_HomerSlerp1-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp1-7B | 42e3df3d9a25d8ff0d470582395f165b2ddb83d8 | 28.483742 | apache-2.0 | 2 | 7.616 | true | false | false | false | 1.366564 | 0.462121 | 46.212051 | 0.551818 | 36.259863 | 0.271903 | 27.190332 | 0.317953 | 9.060403 | 0.435854 | 13.248438 | 0.450382 | 38.931368 | true | false | 2024-11-20 | 2024-11-22 | 1 | allknowingroger/HomerSlerp1-7B (Merge) |
allknowingroger_HomerSlerp2-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp2-7B | 210acef73da0488ea270332f5831b609298a98f0 | 28.9489 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.249973 | 0.448682 | 44.868172 | 0.564894 | 37.9603 | 0.296828 | 29.682779 | 0.319631 | 9.284116 | 0.435573 | 12.846615 | 0.451463 | 39.051418 | true | false | 2024-11-20 | 2024-11-22 | 1 | allknowingroger/HomerSlerp2-7B (Merge) |
allknowingroger_HomerSlerp3-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp3-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp3-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp3-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp3-7B | 4f41686caa5bc39e3b0f075360974057486ece95 | 28.953653 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.234468 | 0.436267 | 43.626688 | 0.559806 | 37.290018 | 0.302115 | 30.21148 | 0.317114 | 8.948546 | 0.446177 | 14.372135 | 0.453457 | 39.27305 | true | false | 2024-11-21 | 2024-11-22 | 1 | allknowingroger/HomerSlerp3-7B (Merge) |
allknowingroger_HomerSlerp4-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/HomerSlerp4-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/HomerSlerp4-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__HomerSlerp4-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/HomerSlerp4-7B | f2ce1f2afa3c645e26ca61ee30f24736873bafa1 | 29.144845 | apache-2.0 | 0 | 7.616 | true | false | false | false | 1.35814 | 0.437416 | 43.741606 | 0.557077 | 36.786834 | 0.327039 | 32.703927 | 0.319631 | 9.284116 | 0.440844 | 13.772135 | 0.447224 | 38.580452 | true | false | 2024-11-21 | 2024-11-22 | 1 | allknowingroger/HomerSlerp4-7B (Merge) |
allknowingroger_LimyQstar-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/LimyQstar-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/LimyQstar-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__LimyQstar-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/LimyQstar-7B-slerp | 6dc557c7bfd6a6f9bc8190bc8a31c3b732deca40 | 18.672525 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.260648 | 0.349114 | 34.911369 | 0.502356 | 30.194567 | 0.068731 | 6.873112 | 0.298658 | 6.487696 | 0.414646 | 10.197396 | 0.310339 | 23.371011 | true | false | 2024-03-23 | 2024-06-26 | 1 | allknowingroger/LimyQstar-7B-slerp (Merge) |
allknowingroger_Llama3.1-60B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | LlamaForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Llama3.1-60B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Llama3.1-60B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Llama3.1-60B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Llama3.1-60B | 5fb1ddcce0bddc60949a9d0c2fc9f8326be5bc4e | 9.951594 | | 0 | 61.997 | false | false | false | false | 26.983717 | 0.181452 | 18.145188 | 0.324176 | 7.784283 | 0 | 0 | 0.294463 | 5.928412 | 0.359583 | 2.18125 | 0.331034 | 25.670434 | false | false | | 2024-10-08 | 0 | Removed |
allknowingroger_Marco-01-slerp1-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Marco-01-slerp1-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Marco-01-slerp1-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Marco-01-slerp1-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Marco-01-slerp1-7B | 12070d5f5bbd891024cb02c363759430ffd3dfba | 29.485317 | apache-2.0 | 0 | 7.616 | true | false | false | false | 1.274648 | 0.468116 | 46.811571 | 0.554094 | 36.231847 | 0.31571 | 31.570997 | 0.317114 | 8.948546 | 0.445188 | 14.648438 | 0.448305 | 38.700502 | true | false | 2024-11-22 | 2024-11-22 | 1 | allknowingroger/Marco-01-slerp1-7B (Merge) |
allknowingroger_Meme-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Meme-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Meme-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Meme-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Meme-7B-slerp | 7836c0f4fce70286382e61003e9a05d7559365d9 | 19.276081 | apache-2.0 | 0 | 7.242 | true | false | false | false | 0.964901 | 0.516375 | 51.637544 | 0.466094 | 24.529486 | 0.043807 | 4.380665 | 0.286074 | 4.809843 | 0.422302 | 10.18776 | 0.281001 | 20.111185 | true | false | 2024-05-22 | 2024-06-26 | 1 | allknowingroger/Meme-7B-slerp (Merge) |
allknowingroger_Ministral-8B-slerp_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ministral-8B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ministral-8B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ministral-8B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ministral-8B-slerp | 51c40046c0f9fead83485ae83b6c0d03f4ae47f2 | 14.900965 | | 0 | 7.248 | false | false | false | false | 2.250197 | 0.19609 | 19.608971 | 0.468602 | 25.195565 | 0.003776 | 0.377644 | 0.312081 | 8.277405 | 0.428531 | 12.39974 | 0.311918 | 23.546469 | false | false | | 2024-10-21 | 0 | Removed |
allknowingroger_MistralPhi3-11B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MistralPhi3-11B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MistralPhi3-11B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MistralPhi3-11B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MistralPhi3-11B | 3afeaf24c6306c4752c320c4fd32fa2e7694e12e | 21.627095 | apache-2.0 | 0 | 11.234 | true | false | false | false | 1.414075 | 0.194291 | 19.429115 | 0.623431 | 46.164629 | 0 | 0 | 0.332215 | 10.961969 | 0.426677 | 12.234635 | 0.46875 | 40.972222 | true | false | 2024-08-26 | 2024-09-02 | 1 | allknowingroger/MistralPhi3-11B (Merge) |
allknowingroger_Mistralmash1-7B-s_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Mistralmash1-7B-s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Mistralmash1-7B-s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Mistralmash1-7B-s-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Mistralmash1-7B-s | 730b7b2867deef63961f002b6e1e70e7d416c599 | 20.913866 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.322377 | 0.3961 | 39.610013 | 0.527749 | 33.448554 | 0.092145 | 9.214502 | 0.294463 | 5.928412 | 0.426708 | 11.805208 | 0.329289 | 25.476507 | true | false | 2024-08-27 | 2024-09-02 | 1 | allknowingroger/Mistralmash1-7B-s (Merge) |
allknowingroger_Mistralmash2-7B-s_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Mistralmash2-7B-s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Mistralmash2-7B-s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Mistralmash2-7B-s-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Mistralmash2-7B-s | 3b2aafa0f931f3d3103fbc96a6da4ac36f376d78 | 21.389681 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.331074 | 0.410188 | 41.01883 | 0.530486 | 33.298364 | 0.079305 | 7.930514 | 0.297819 | 6.375839 | 0.43725 | 13.65625 | 0.334525 | 26.058289 | true | false | 2024-08-27 | 2024-09-02 | 1 | allknowingroger/Mistralmash2-7B-s (Merge) |
allknowingroger_MixTAO-19B-pass_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MixTAO-19B-pass" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MixTAO-19B-pass</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MixTAO-19B-pass-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MixTAO-19B-pass | a41369cfcfbada9d5387051ba616bf1432b31d31 | 20.627592 | apache-2.0 | 2 | 19.188 | true | false | false | false | 2.510265 | 0.381437 | 38.143681 | 0.512825 | 31.577918 | 0.061178 | 6.117825 | 0.284396 | 4.58613 | 0.478271 | 19.950521 | 0.310505 | 23.38948 | true | false | 2024-06-02 | 2024-06-26 | 1 | allknowingroger/MixTAO-19B-pass (Merge) |
allknowingroger_MixTaoTruthful-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MixTaoTruthful-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MixTaoTruthful-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MixTaoTruthful-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MixTaoTruthful-13B-slerp | 3324d37e138c6bf0d6891e54b6dd839c8d2f35ec | 20.252976 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.616217 | 0.413885 | 41.388516 | 0.520734 | 32.706362 | 0.066465 | 6.646526 | 0.284396 | 4.58613 | 0.42925 | 12.85625 | 0.310007 | 23.334072 | true | false | 2024-05-25 | 2024-06-26 | 1 | allknowingroger/MixTaoTruthful-13B-slerp (Merge) |
allknowingroger_MultiCalm-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiCalm-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiCalm-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiCalm-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiCalm-7B-slerp | 1c23540e907fab4dfe0ef66edd0003e764bfe568 | 19.472289 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.233147 | 0.392653 | 39.265261 | 0.512189 | 31.466483 | 0.061934 | 6.193353 | 0.282718 | 4.362416 | 0.431948 | 12.960156 | 0.303275 | 22.586067 | true | false | 2024-05-19 | 2024-06-26 | 1 | allknowingroger/MultiCalm-7B-slerp (Merge) |
allknowingroger_MultiMash-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash-12B-slerp | 91a6d0fe6b9271000ca713ee9ab414c782ba4c50 | 20.179904 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.68396 | 0.397449 | 39.744877 | 0.514183 | 31.925677 | 0.080816 | 8.081571 | 0.276846 | 3.579418 | 0.443792 | 14.773958 | 0.306765 | 22.973921 | true | false | 2024-05-20 | 2024-06-26 | 1 | allknowingroger/MultiMash-12B-slerp (Merge) |
allknowingroger_MultiMash10-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash10-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash10-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash10-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash10-13B-slerp | 6def2fd1a11d4c380a19b7a3bdf263a6b80cd8f3 | 20.426436 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.758718 | 0.416283 | 41.628324 | 0.518634 | 32.452502 | 0.071752 | 7.175227 | 0.286074 | 4.809843 | 0.431792 | 12.973958 | 0.311669 | 23.518765 | true | false | 2024-05-27 | 2024-06-26 | 1 | allknowingroger/MultiMash10-13B-slerp (Merge) |
allknowingroger_MultiMash11-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash11-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash11-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash11-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash11-13B-slerp | 1134a0adabef4a26e1d49c302baff74c4a7e9f46 | 20.614676 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.971477 | 0.425101 | 42.510095 | 0.519386 | 32.596703 | 0.070242 | 7.024169 | 0.282718 | 4.362416 | 0.437281 | 14.026823 | 0.308511 | 23.167849 | true | false | 2024-05-27 | 2024-06-26 | 1 | allknowingroger/MultiMash11-13B-slerp (Merge) |
allknowingroger_MultiMash2-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash2-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash2-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash2-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash2-12B-slerp | e44e9563368699f753a4474b068c059d233ddee3 | 19.840143 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.640279 | 0.426075 | 42.607504 | 0.513397 | 31.61795 | 0.064199 | 6.41994 | 0.279362 | 3.914989 | 0.422802 | 11.783594 | 0.304272 | 22.696882 | true | false | 2024-05-20 | 2024-06-26 | 1 | allknowingroger/MultiMash2-12B-slerp (Merge) |
allknowingroger_MultiMash5-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash5-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash5-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash5-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash5-12B-slerp | 15ef0301c7ce939208d55ad13fa840662f92bce6 | 19.590305 | 0 | 12.879 | false | false | false | false | 1.608333 | 0.41416 | 41.415998 | 0.514453 | 31.856364 | 0.063444 | 6.344411 | 0.277685 | 3.691275 | 0.420292 | 11.703125 | 0.302776 | 22.530659 | false | false | 2024-06-26 | 0 | Removed |
allknowingroger_MultiMash6-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash6-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash6-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash6-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash6-12B-slerp | a04856a12b85e986e1b540cf0c7510e9ce2df09b | 20.276427 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.649145 | 0.430047 | 43.004672 | 0.519592 | 32.40388 | 0.072508 | 7.250755 | 0.274329 | 3.243848 | 0.430583 | 12.522917 | 0.309092 | 23.232491 | true | false | 2024-05-22 | 2024-06-26 | 1 | allknowingroger/MultiMash6-12B-slerp (Merge) |
allknowingroger_MultiMash7-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash7-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash7-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash7-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash7-12B-slerp | 5f91dd41fb4b58e76c52b03ed15477a046b079df | 19.792294 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.643543 | 0.421279 | 42.127887 | 0.511114 | 31.29815 | 0.069486 | 6.94864 | 0.278523 | 3.803132 | 0.427948 | 12.026823 | 0.302942 | 22.549128 | true | false | 2024-05-22 | 2024-06-26 | 1 | allknowingroger/MultiMash7-12B-slerp (Merge) |
allknowingroger_MultiMash8-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash8-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash8-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash8-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash8-13B-slerp | 5590ccd99f74301951f450f9d0271a99e97728c8 | 21.074864 | apache-2.0 | 0 | 12.879 | true | false | false | false | 2.598432 | 0.43207 | 43.207024 | 0.517848 | 32.272997 | 0.077039 | 7.703927 | 0.288591 | 5.145414 | 0.442396 | 14.499479 | 0.312583 | 23.620346 | true | false | 2024-05-26 | 2024-09-02 | 1 | allknowingroger/MultiMash8-13B-slerp (Merge) |
allknowingroger_MultiMash9-13B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMash9-13B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMash9-13B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMash9-13B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMash9-13B-slerp | 56dac45f387669baa04a8997ebb9ea63c65fbbd1 | 20.64297 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.731462 | 0.418781 | 41.878106 | 0.519358 | 32.552612 | 0.07855 | 7.854985 | 0.280201 | 4.026846 | 0.439823 | 14.211198 | 0.310007 | 23.334072 | true | false | 2024-05-26 | 2024-06-26 | 1 | allknowingroger/MultiMash9-13B-slerp (Merge) |
allknowingroger_MultiMerge-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiMerge-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiMerge-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiMerge-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiMerge-7B-slerp | a026bbea09f0b1880deed62b9081e3708be0dec2 | 19.542247 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.226344 | 0.394776 | 39.477586 | 0.514022 | 31.803983 | 0.066465 | 6.646526 | 0.282718 | 4.362416 | 0.427979 | 12.330729 | 0.30369 | 22.63224 | true | false | 2024-04-11 | 2024-06-26 | 1 | allknowingroger/MultiMerge-7B-slerp (Merge) |
allknowingroger_Multimash3-12B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Multimash3-12B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Multimash3-12B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Multimash3-12B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Multimash3-12B-slerp | 0b90bf0b5230d02b4ba63879fc3bf0b85d46c3ce | 20.470733 | apache-2.0 | 0 | 12.879 | true | false | false | false | 1.689316 | 0.44371 | 44.371047 | 0.517662 | 32.150891 | 0.062689 | 6.268882 | 0.280201 | 4.026846 | 0.434396 | 13.032813 | 0.306765 | 22.973921 | true | false | 2024-05-21 | 2024-06-26 | 1 | allknowingroger/Multimash3-12B-slerp (Merge) |
allknowingroger_Multimerge-19B-pass_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MixtralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Multimerge-19B-pass" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Multimerge-19B-pass</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Multimerge-19B-pass-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Multimerge-19B-pass | e75918ed5601f400f62601cf6c0887aa936e8a52 | 4.536203 | 0 | 19.188 | false | false | false | false | 3.930758 | 0.177305 | 17.730511 | 0.289178 | 2.080374 | 0 | 0 | 0.259228 | 1.230425 | 0.342958 | 4.303125 | 0.116855 | 1.872784 | false | false | 2024-06-26 | 0 | Removed |
allknowingroger_MultiverseEx26-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/MultiverseEx26-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/MultiverseEx26-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__MultiverseEx26-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/MultiverseEx26-7B-slerp | 43f18d84e025693f00e9be335bf12fce96089b2f | 19.695899 | apache-2.0 | 1 | 7.242 | true | false | false | false | 1.210985 | 0.393852 | 39.385165 | 0.513359 | 31.663775 | 0.075529 | 7.55287 | 0.282718 | 4.362416 | 0.429313 | 12.597396 | 0.303524 | 22.613771 | true | false | 2024-03-30 | 2024-06-26 | 1 | allknowingroger/MultiverseEx26-7B-slerp (Merge) |
allknowingroger_NeuralWestSeverus-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/NeuralWestSeverus-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/NeuralWestSeverus-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__NeuralWestSeverus-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/NeuralWestSeverus-7B-slerp | 5ee5d6a11ffc4f9733e78994169a2e1614d5e16e | 20.675371 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.168093 | 0.41356 | 41.356046 | 0.524428 | 33.414467 | 0.073263 | 7.326284 | 0.270973 | 2.796421 | 0.452875 | 15.409375 | 0.313747 | 23.749631 | true | false | 2024-05-16 | 2024-06-26 | 1 | allknowingroger/NeuralWestSeverus-7B-slerp (Merge) |
allknowingroger_Neuralcoven-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Neuralcoven-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Neuralcoven-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Neuralcoven-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Neuralcoven-7B-slerp | 129b40a7fd816f679ef5d4ab29fc77345f33a7b1 | 20.36367 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.263822 | 0.385858 | 38.585841 | 0.530287 | 33.799135 | 0.07855 | 7.854985 | 0.285235 | 4.697987 | 0.429 | 11.758333 | 0.329372 | 25.485742 | true | false | 2024-05-17 | 2024-06-26 | 1 | allknowingroger/Neuralcoven-7B-slerp (Merge) |
allknowingroger_Neuralmultiverse-7B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Neuralmultiverse-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Neuralmultiverse-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Neuralmultiverse-7B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Neuralmultiverse-7B-slerp | a65fe05e26e10a488b08264ac8ed73a49c3f263a | 19.36103 | apache-2.0 | 0 | 7.242 | true | false | false | false | 1.234319 | 0.376915 | 37.691547 | 0.516572 | 32.10018 | 0.064955 | 6.495468 | 0.284396 | 4.58613 | 0.428042 | 12.605208 | 0.304189 | 22.687648 | true | false | 2024-05-17 | 2024-06-26 | 1 | allknowingroger/Neuralmultiverse-7B-slerp (Merge) |
allknowingroger_Ph3della5-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3della5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3della5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3della5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3della5-14B | 9c6819a910d4da414dd67c10da3bff3f986fefa5 | 30.469739 | apache-2.0 | 0 | 13.96 | true | false | false | false | 2.09185 | 0.479856 | 47.985567 | 0.633175 | 48.414364 | 0.176737 | 17.673716 | 0.342282 | 12.304251 | 0.438615 | 14.360156 | 0.478723 | 42.080378 | true | false | 2024-09-05 | 2024-10-08 | 1 | allknowingroger/Ph3della5-14B (Merge) |
allknowingroger_Ph3merge-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3merge-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3merge-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3merge-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3merge-14B | 6d0ddaa4e0cf4c82d7149cc726b08be5753a760a | 23.683333 | apache-2.0 | 0 | 13.619 | true | false | false | false | 4.024665 | 0.270129 | 27.012881 | 0.638088 | 48.882424 | 0.010574 | 1.057402 | 0.338087 | 11.744966 | 0.433438 | 13.279688 | 0.461104 | 40.122636 | true | false | 2024-08-30 | 2024-09-02 | 1 | allknowingroger/Ph3merge-14B (Merge) |
allknowingroger_Ph3merge2-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3merge2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3merge2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3merge2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3merge2-14B | 2256ab821e286a1d8a4f0d42e00a50013e119671 | 7.962731 | 0 | 13.619 | false | false | false | false | 4.122009 | 0.170611 | 17.061065 | 0.360694 | 10.549968 | 0 | 0 | 0.291107 | 5.480984 | 0.391083 | 6.652083 | 0.172291 | 8.032284 | false | false | 2024-10-08 | 0 | Removed |
allknowingroger_Ph3merge3-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3merge3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3merge3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3merge3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3merge3-14B | 90a036f7f136932ea525b5fd26cf2f54a66141af | 7.931824 | 0 | 13.619 | false | false | false | false | 3.978513 | 0.164516 | 16.451571 | 0.359743 | 10.39138 | 0 | 0 | 0.285235 | 4.697987 | 0.408198 | 8.858073 | 0.164727 | 7.191933 | false | false | 2024-09-02 | 0 | Removed |
allknowingroger_Ph3task1-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3task1-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3task1-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3task1-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3task1-14B | c9a5bab157dbdd281c651a5b7ea82a8bc64aa420 | 30.548398 | apache-2.0 | 0 | 13.96 | true | false | false | false | 2.017697 | 0.469464 | 46.946435 | 0.631781 | 47.926908 | 0.166918 | 16.691843 | 0.350671 | 13.422819 | 0.450771 | 16.813021 | 0.473404 | 41.489362 | true | false | 2024-09-07 | 2024-10-08 | 1 | allknowingroger/Ph3task1-14B (Merge) |
allknowingroger_Ph3task2-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3task2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3task2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3task2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3task2-14B | 2193bfec75bc90e87bc57863e02deefbdd195f9f | 28.611111 | apache-2.0 | 0 | 13.96 | true | false | false | false | 1.870132 | 0.471313 | 47.131278 | 0.609841 | 44.081796 | 0.146526 | 14.652568 | 0.330537 | 10.738255 | 0.4535 | 16.620833 | 0.445977 | 38.441933 | true | false | 2024-09-08 | 2024-10-08 | 1 | allknowingroger/Ph3task2-14B (Merge) |
allknowingroger_Ph3task3-14B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Phi3ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3task3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3task3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3task3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3task3-14B | 359de5c4969057206f846a41c72073b3429317fd | 30.710222 | apache-2.0 | 0 | 13.96 | true | false | false | false | 2.07523 | 0.496242 | 49.624219 | 0.629792 | 47.998499 | 0.175982 | 17.598187 | 0.341443 | 12.192394 | 0.442552 | 14.952344 | 0.477061 | 41.895686 | true | false | 2024-09-08 | 2024-10-08 | 1 | allknowingroger/Ph3task3-14B (Merge) |
allknowingroger_Ph3unsloth-3B-slerp_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Ph3unsloth-3B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Ph3unsloth-3B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Ph3unsloth-3B-slerp-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Ph3unsloth-3B-slerp | 465444b3cdd43876717f7386ea2f3357c5fe8e53 | 20.153515 | apache-2.0 | 0 | 3.821 | true | false | false | false | 1.058084 | 0.189445 | 18.944512 | 0.546808 | 36.458773 | 0.101208 | 10.120846 | 0.324664 | 9.955257 | 0.452781 | 15.43099 | 0.370096 | 30.010712 | true | false | 2024-05-31 | 2024-06-26 | 1 | allknowingroger/Ph3unsloth-3B-slerp (Merge) |
allknowingroger_Phi3mash1-17B-pass_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Phi3mash1-17B-pass" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Phi3mash1-17B-pass</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Phi3mash1-17B-pass-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Phi3mash1-17B-pass | fcd265996f026475c15fa44833e0481dc610e469 | 21.349969 | apache-2.0 | 0 | 16.687 | true | false | false | false | 2.916253 | 0.188421 | 18.842117 | 0.612888 | 45.250419 | 0 | 0 | 0.319631 | 9.284116 | 0.445125 | 14.840625 | 0.458943 | 39.882535 | true | false | 2024-08-28 | 2024-09-02 | 1 | allknowingroger/Phi3mash1-17B-pass (Merge) |
allknowingroger_Quen2-65B_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Quen2-65B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Quen2-65B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Quen2-65B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Quen2-65B | 2259cd8ea037d0e590920e7106b0fd1641a96c1d | 3.531344 | 0 | 63.923 | false | false | false | false | 26.634847 | 0.175781 | 17.578137 | 0.275652 | 1.23986 | 0 | 0 | 0.235738 | 0 | 0.320854 | 1.106771 | 0.11137 | 1.263298 | false | false | 2024-09-19 | 0 | Removed |
allknowingroger_Qwen2.5-42B-AGI_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-42B-AGI" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-42B-AGI</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-42B-AGI-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-42B-AGI | 8939b021a9d84bc2e4ae0ea4f351d807f35b91d7 | 4.47083 | 0 | 42.516 | false | false | false | false | 17.713962 | 0.191294 | 19.129355 | 0.29421 | 2.235886 | 0 | 0 | 0.260067 | 1.342282 | 0.362031 | 2.253906 | 0.116772 | 1.863549 | false | false | 2024-10-21 | 0 | Removed |
allknowingroger_Qwen2.5-7B-task2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task2-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task2 | 6f3ae972b2bbde0383c3a774e0e788a1af0dabc5 | 29.877934 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.383779 | 0.452703 | 45.270327 | 0.562594 | 37.52855 | 0.354985 | 35.498489 | 0.316275 | 8.836689 | 0.436969 | 13.054427 | 0.451712 | 39.079122 | true | false | 2024-10-31 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task2 (Merge) |
allknowingroger_Qwen2.5-7B-task3_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task3 | b1e524004242cdeec838ba21bce44ebb8598c12f | 28.738761 | apache-2.0 | 0 | 7.616 | true | false | false | false | 1.246416 | 0.512904 | 51.290354 | 0.539762 | 34.385984 | 0.260574 | 26.057402 | 0.317114 | 8.948546 | 0.435573 | 12.846615 | 0.450133 | 38.903664 | true | false | 2024-11-01 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task3 (Merge) |
allknowingroger_Qwen2.5-7B-task4_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task4 | ef4fe9331a0b9c34d829fcd5b1a09a7056e9300f | 30.061809 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.324199 | 0.500539 | 50.053857 | 0.558345 | 37.025269 | 0.311178 | 31.117825 | 0.32047 | 9.395973 | 0.439542 | 13.209375 | 0.456117 | 39.568558 | true | false | 2024-11-01 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task4 (Merge) |
allknowingroger_Qwen2.5-7B-task7_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task7" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task7</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task7-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task7 | 090a873c77ed291867ddaf20249ed7f479ba4ba9 | 24.01694 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.356517 | 0.428423 | 42.842325 | 0.555243 | 37.51817 | 0.064955 | 6.495468 | 0.32047 | 9.395973 | 0.432563 | 13.036979 | 0.413314 | 34.812722 | true | false | 2024-11-04 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task7 (Merge) |
allknowingroger_Qwen2.5-7B-task8_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-7B-task8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-7B-task8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-7B-task8-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-7B-task8 | 489a9a6fc98001026d9b96563d715cad43aabc8c | 30.109425 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.395552 | 0.464519 | 46.451859 | 0.55249 | 36.092737 | 0.352719 | 35.271903 | 0.32047 | 9.395973 | 0.451448 | 15.297656 | 0.443318 | 38.146424 | true | false | 2024-11-04 | 2024-11-04 | 1 | allknowingroger/Qwen2.5-7B-task8 (Merge) |
allknowingroger_Qwen2.5-slerp-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwen2.5-slerp-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwen2.5-slerp-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwen2.5-slerp-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwen2.5-slerp-14B | a44b0ea8291b62785152c2fe6ab336f5da672d1e | 38.162776 | apache-2.0 | 0 | 14.77 | true | false | false | false | 4.733764 | 0.49282 | 49.282016 | 0.651242 | 49.789537 | 0.462236 | 46.223565 | 0.36745 | 15.659955 | 0.474396 | 19.366146 | 0.537899 | 48.655437 | true | false | 2024-10-17 | 2024-10-21 | 1 | allknowingroger/Qwen2.5-slerp-14B (Merge) |
allknowingroger_QwenSlerp12-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp12-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp12-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp12-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp12-7B | be6510452755c2c8e559333ecaf68dc6b37637d9 | 29.989027 | apache-2.0 | 1 | 7.616 | true | false | false | false | 1.366068 | 0.507558 | 50.755772 | 0.555645 | 36.411303 | 0.294562 | 29.456193 | 0.315436 | 8.724832 | 0.459479 | 16.134896 | 0.446061 | 38.451167 | true | false | 2024-11-18 | 2024-11-22 | 1 | allknowingroger/QwenSlerp12-7B (Merge) |
allknowingroger_QwenSlerp4-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp4-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp4-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp4-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp4-14B | 3a55f52f639fc380a829b7cace5be3c96fcad730 | 38.798744 | 1 | 14.766 | false | false | false | false | 3.758324 | 0.632754 | 63.275442 | 0.648325 | 49.38124 | 0.369335 | 36.933535 | 0.372483 | 16.331096 | 0.464969 | 17.58776 | 0.543551 | 49.283392 | false | false | 2024-11-27 | 2024-12-06 | 1 | allknowingroger/QwenSlerp4-14B (Merge) |
allknowingroger_QwenSlerp5-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp5-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp5-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp5-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp5-14B | f1eac24bb5338ae11951d38ba09ff71f4d319cc9 | 39.358258 | 1 | 14.766 | false | false | false | false | 3.723625 | 0.711939 | 71.193877 | 0.635657 | 47.38764 | 0.356495 | 35.649547 | 0.364933 | 15.324385 | 0.467542 | 17.809375 | 0.539063 | 48.784722 | false | false | 2024-11-27 | 2024-12-06 | 1 | allknowingroger/QwenSlerp5-14B (Merge) |
allknowingroger_QwenSlerp6-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenSlerp6-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenSlerp6-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenSlerp6-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenSlerp6-14B | eff132ab6d7f612b46c47b29966f8391cea7b407 | 39.534571 | apache-2.0 | 1 | 14.766 | true | false | false | false | 3.481654 | 0.686685 | 68.668466 | 0.638445 | 47.588317 | 0.372356 | 37.23565 | 0.373322 | 16.442953 | 0.468969 | 18.321094 | 0.540559 | 48.950946 | true | false | 2024-11-28 | 2024-12-06 | 1 | allknowingroger/QwenSlerp6-14B (Merge) |
allknowingroger_QwenStock1-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenStock1-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenStock1-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenStock1-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenStock1-14B | 79daacc58a5ec97f297c4a99dbb31d19ae4c59ca | 38.145649 | apache-2.0 | 2 | 14.766 | true | false | false | false | 4.080549 | 0.563412 | 56.341175 | 0.652849 | 50.076293 | 0.376888 | 37.688822 | 0.376678 | 16.89038 | 0.472969 | 18.78776 | 0.541805 | 49.089465 | true | false | 2024-11-28 | 2024-12-06 | 1 | allknowingroger/QwenStock1-14B (Merge) |
allknowingroger_QwenStock2-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenStock2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenStock2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenStock2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenStock2-14B | 69fd5f98c812cfa26d8514349669158a93058bf7 | 38.419175 | apache-2.0 | 1 | 14.766 | true | false | false | false | 4.042088 | 0.556343 | 55.634273 | 0.656885 | 50.598276 | 0.388218 | 38.821752 | 0.379195 | 17.225951 | 0.475604 | 19.283854 | 0.540559 | 48.950946 | true | false | 2024-11-29 | 2024-12-06 | 1 | allknowingroger/QwenStock2-14B (Merge) |
allknowingroger_QwenStock3-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/QwenStock3-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/QwenStock3-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__QwenStock3-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/QwenStock3-14B | 834fbf35e01efc44e4f2c8c372d7c1412754c0fa | 38.320004 | apache-2.0 | 1 | 14.766 | true | false | false | false | 4.299284 | 0.561513 | 56.151345 | 0.656532 | 50.576674 | 0.377644 | 37.76435 | 0.378356 | 17.114094 | 0.475573 | 19.113281 | 0.542803 | 49.200281 | true | false | 2024-11-29 | 2024-12-06 | 1 | allknowingroger/QwenStock3-14B (Merge) |
allknowingroger_Qwenslerp2-14B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp2-14B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp2-14B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp2-14B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp2-14B | 38e902c114b5640509a8615fc2a2546e07a5fb3f | 38.085983 | apache-2.0 | 1 | 14.77 | true | false | false | false | 4.528887 | 0.500714 | 50.071366 | 0.655488 | 50.303692 | 0.445619 | 44.561934 | 0.368289 | 15.771812 | 0.472938 | 18.883854 | 0.540309 | 48.923242 | true | false | 2024-10-19 | 2024-10-21 | 1 | allknowingroger/Qwenslerp2-14B (Merge) |
allknowingroger_Qwenslerp2-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Qwen2ForCausalLM | <a target="_blank" href="https://huggingface.co/allknowingroger/Qwenslerp2-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">allknowingroger/Qwenslerp2-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/allknowingroger__Qwenslerp2-7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a> | allknowingroger/Qwenslerp2-7B | 46fe65fc2567b2430fa421478d47134ffe55c8f8 | 30.810469 | apache-2.0 | 0 | 7.616 | true | false | false | false | 1.29231 | 0.52944 | 52.943966 | 0.560913 | 37.437245 | 0.342145 | 34.214502 | 0.312919 | 8.389262 | 0.435604 | 12.817188 | 0.451546 | 39.060653 | true | false | 2024-10-31 | 2024-11-04 | 1 | allknowingroger/Qwenslerp2-7B (Merge) |
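The rows above are a flat export of the leaderboard contents table, one evaluated model per row. The sketch below shows one way such an export could be filtered programmatically. It is a minimal illustration only: the repo id `open-llm-leaderboard/contents`, the `train` split, and the column labels used in the code are assumptions about how this table is published and should be verified against the actual dataset card.

```python
# Minimal sketch: load the leaderboard contents table and rank the
# allknowingroger merges shown in this section by their average score.
# Assumptions: the table is published as "open-llm-leaderboard/contents"
# with a "train" split, and the column names below match its schema.
from datasets import load_dataset

rows = load_dataset("open-llm-leaderboard/contents", split="train")
df = rows.to_pandas()

# Keep only rows whose fullname points at the allknowingroger namespace.
merges = df[df["fullname"].str.startswith("allknowingroger/")]

# Sort by the leaderboard's headline average and show a few key columns.
top = merges.sort_values("Average ⬆️", ascending=False)
print(top[["fullname", "Average ⬆️", "MMLU-PRO", "#Params (B)"]].head(10))
```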