Schema (26 columns per record):

- eval_name: string, 9–97 chars
- Precision: string, 5 classes
- Type: string, 6 classes
- T: string, 6 classes (the emoji prefix of Type)
- Weight type: string, 3 classes
- Architecture: string, 53 classes
- Model: string, 355–611 chars (an HTML link to the model repo and its details dataset; the link text duplicates fullname)
- fullname: string, 4–89 chars
- Model sha: string, 0–40 chars
- Average ⬆️: float64, 27–81.3
- Hub License: string, 35 classes
- Hub ❤️: int64, 0–4.88k
- #Params (B): int64, 0–238
- Available on the hub: bool, 2 classes
- Merged: bool, 2 classes
- MoE: bool, 2 classes
- Flagged: bool, 1 class
- date: string, 0–26 chars
- Chat Template: bool, 2 classes
- ARC: float64, 19.7–87.5
- HellaSwag: float64, 20.7–92.8
- MMLU: float64, 17.8–89.4
- TruthfulQA: float64, 27.9–82.3
- Winogrande: float64, 47.2–91.5
- GSM8K: float64, 0–88.2
- Maintainers Choice: bool, 2 classes
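The Average ⬆️ column appears to be the unweighted mean of the six benchmark scores (ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K); this is an inference from the data, not stated in the dump. A minimal check on the first record below:

```python
# Check that "Average ⬆️" is the unweighted mean of the six benchmark scores.
# Values are the fierysurf/Ambari-7B-Instruct-v0.1-sharded record from this dump.
scores = {
    "ARC": 50.0,
    "HellaSwag": 74.586736,
    "MMLU": 38.032714,
    "TruthfulQA": 40.392499,
    "Winogrande": 69.534333,
    "GSM8K": 1.895375,
}

average = sum(scores.values()) / len(scores)
print(round(average, 6))  # 45.740276, matching the record's reported Average
```

The same relationship holds for the other complete records, to rounding error.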
Records (the HTML Model-link column is collapsed into Model, whose link text duplicates fullname, and the T column, the emoji prefix of Type, is folded into Type; empty Hub License cells are empty in the source):

| eval_name | Precision | Type | Weight type | Architecture | Model | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Merged | MoE | Flagged | date | Chat Template | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Maintainers Choice |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| fierysurf_Ambari-7B-Instruct-v0.1-sharded_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | fierysurf/Ambari-7B-Instruct-v0.1-sharded | d5f311d103dab0eeac1d5208130645c5a3dbfcd5 | 45.740276 | mit | 0 | 6 | true | true | true | true | 2024-01-18T08:54:39Z | false | 50 | 74.586736 | 38.032714 | 40.392499 | 69.534333 | 1.895375 | false |
| fierysurf_Ambari-7B-base-v0.1-sharded_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | fierysurf/Ambari-7B-base-v0.1-sharded | a8305467fb07f667c4aa1ba61a78ab3b3c0c23e1 | 45.919472 | mit | 0 | 6 | true | true | true | true | 2024-01-18T08:53:59Z | false | 47.952218 | 74.61661 | 40.385889 | 38.910013 | 72.059984 | 1.592115 | false |
| fierysurf_Kan-LLaMA-7B-SFT-v0.1-sharded_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | fierysurf/Kan-LLaMA-7B-SFT-v0.1-sharded | a04fd8b0958c11d7316965207d67b707cf4702f5 | 45.759932 | mit | 0 | 6 | true | true | true | true | 2024-01-18T08:49:09Z | false | 45.904437 | 71.429994 | 40.856613 | 45.042657 | 68.823994 | 2.501895 | false |
| fierysurf_Kan-LLaMA-7B-base_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | fierysurf/Kan-LLaMA-7B-base | 66ae057862e1201128113b4c8f3875c1a3fd8ef2 | 43.305684 | mit | 0 | 6 | true | true | true | true | 2024-01-18T08:45:47Z | false | 43.94198 | 70.752838 | 37.056254 | 39.574747 | 68.508287 | 0 | false |
| fionazhang_fine-tune-mistral-environment-merge_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | fionazhang/fine-tune-mistral-environment-merge | 162b38e3aea3c55fef316ab7f42af3af3a440c07 | 61.386727 | | 0 | 7 | false | true | true | true | 2024-01-29T00:21:40Z | false | 62.627986 | 83.658634 | 63.879078 | 43.974086 | 78.926598 | 35.25398 | false |
| fionazhang_fine-tune-mistral-long-merge_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | fionazhang/fine-tune-mistral-long-merge | 2675e1e670ebe54c733ed27fb32d8610644eefca | 61.474243 | | 0 | 7 | false | true | true | true | 2024-01-30T00:28:03Z | false | 62.883959 | 83.618801 | 63.392422 | 43.935732 | 78.926598 | 36.087945 | false |
| fionazhang_mistral-environment-adapter_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | fionazhang/mistral-environment-adapter | 28910193dcfc67b615e918c6cd90162b9ef12446 | 29.926766 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-01-16T00:48:34Z | false | 29.180887 | 25.811591 | 25.383131 | 48.750889 | 50.434096 | 0 | false |
| fionazhang_mistral-environment-all_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | fionazhang/mistral-environment-all | ba2832b0dbd70860408d7786026549407c951a8a | 29.177041 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-01-16T03:59:09Z | false | 29.43686 | 25.891257 | 23.116858 | 47.919563 | 48.697711 | 0 | false |
| fionazhang_mistral-experiment-6_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Adapter | ? | fionazhang/mistral-experiment-6 | df18562607b2ba0fc296da17c398b9d3451c6a89 | 55.75193 | apache-2.0 | 0 | 0 | true | true | true | true | 2024-01-28T23:12:55Z | false | 55.802048 | 81.447919 | 55.565549 | 45.685896 | 73.796369 | 22.213798 | false |
| fionazhang_mistral-experiment-6-merge_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | fionazhang/mistral-experiment-6-merge | 2a6525f8b5c6d02ef78e716ccb37c6ef1bb1a26d | 62.102591 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-01-28T23:12:16Z | false | 63.822526 | 84.246166 | 62.912634 | 44.989085 | 77.979479 | 38.665656 | false |
| fireballoon_baichuan-vicuna-chinese-7b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | fireballoon/baichuan-vicuna-chinese-7b | 6cdb9e75cd473e31e87067c2a0b646083247d9ab | 46.057851 | null | 62 | 7 | false | true | true | true | 2023-09-09T10:52:17Z | false | 43.515358 | 71.121291 | 46.874779 | 42.450352 | 66.850829 | 5.534496 | false |
| flammenai_Mahou-1.0-llama3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | flammenai/Mahou-1.0-llama3-8B | 76fe3f7ce2bfc8cb908298828cf4aed8b2006514 | 72.091694 | other | 1 | 8 | true | true | true | true | 2024-05-10T12:12:22Z | false | 69.453925 | 84.953197 | 68.482282 | 59.04739 | 78.058406 | 72.554966 | false |
| flammenai_Mahou-1.0-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/Mahou-1.0-mistral-7B | 2b0b54705eb18c0b61308e4f2eb2d7c3c6bdd421 | 72.641018 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-05-10T12:12:49Z | false | 67.406143 | 86.51663 | 64.156869 | 72.417717 | 81.057616 | 64.29113 | false |
| flammenai_Mahou-1.1-llama3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | flammenai/Mahou-1.1-llama3-8B | 94f45a19632908df9e17dfa8cac5dcd15142b50e | 72.285216 | other | 3 | 8 | true | true | true | true | 2024-05-10T12:13:27Z | false | 70.221843 | 85.640311 | 68.464035 | 60.430332 | 78.295185 | 70.659591 | false |
| flammenai_Mahou-1.1-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/Mahou-1.1-mistral-7B | c94da8287e7a22bec338cad7cabb61cabb5e4a27 | 71.825473 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-05-10T12:13:10Z | false | 66.723549 | 86.646086 | 64.281075 | 72.703699 | 79.794791 | 60.803639 | false |
| flammenai_Mahou-1.2-llama3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | flammenai/Mahou-1.2-llama3-8B | 92a7dff9afdc8d5db8f44b28e7e873bb0933d7c4 | 72.19136 | llama3 | 8 | 8 | true | true | true | true | 2024-05-18T17:18:54Z | false | 69.795222 | 84.654451 | 68.427289 | 60.501124 | 77.821626 | 71.948446 | false |
| flammenai_Mahou-1.2-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/Mahou-1.2-mistral-7B | 1a8b507ab4afb22f364bdbd3b94c3ecbd34aa008 | 71.440992 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-05-18T17:19:21Z | false | 67.406143 | 86.984664 | 64.559346 | 71.327119 | 79.005525 | 59.363154 | false |
| flammenai_Mahou-1.2a-llama3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | flammenai/Mahou-1.2a-llama3-8B | 3318b6f5f1839644bee287a3e5390f3e9f565a9e | 71.466571 | llama3 | 5 | 8 | true | true | true | true | 2024-05-29T00:45:23Z | false | 68.856655 | 83.967337 | 68.515083 | 58.761025 | 77.584846 | 71.114481 | false |
| flammenai_Mahou-1.2a-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/Mahou-1.2a-mistral-7B | 2870c71f924f71314167f7df163bc0b3a53cd4f7 | 72.345535 | apache-2.0 | 6 | 7 | true | true | true | true | 2024-05-18T17:20:24Z | false | 68.771331 | 88.139813 | 64.734481 | 68.708294 | 80.110497 | 63.608795 | false |
| flammenai_Mahou-1.3-llama3-8B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | flammenai/Mahou-1.3-llama3-8B | fbcd3b98d76ffa0b589085b2361d2b993bea1e9f | 72.59323 | llama3 | 2 | 8 | true | true | true | true | 2024-05-29T00:45:38Z | false | 69.197952 | 84.853615 | 68.678904 | 61.047197 | 78.847672 | 72.934041 | false |
| flammenai_Mahou-1.3-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/Mahou-1.3-mistral-7B | 3efc0d5e9d8f873912eeed2b1ebb20670d2edd23 | 71.268351 | apache-2.0 | 2 | 7 | true | true | true | true | 2024-05-29T00:45:53Z | false | 70.051195 | 87.293368 | 64.569172 | 67.590504 | 80.031571 | 58.074299 | false |
| flammenai_flammen18X-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen18X-mistral-7B | a5bb8eb287b783d651fa2552fd67e83d8a2f17e9 | 75.495171 | apache-2.0 | 4 | 7 | true | true | true | true | 2024-04-18T03:52:29Z | false | 72.440273 | 88.677554 | 64.712831 | 74.413807 | 83.583268 | 69.14329 | false |
| flammenai_flammen20-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen20-mistral-7B | 576779275198acb80c088799fc9902e7557537ec | 73.081514 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-04-28T16:35:24Z | false | 70.56314 | 86.327425 | 64.334931 | 68.753529 | 82.399369 | 66.11069 | false |
| flammenai_flammen21X-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen21X-mistral-7B | a4daca77b530ac5569267c649015de32e39c7518 | 73.094973 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-04-28T16:36:00Z | false | 70.56314 | 86.347341 | 64.168044 | 68.456773 | 82.241515 | 66.793025 | false |
| flammenai_flammen22-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen22-mistral-7B | 66787ca37aac18d7b95058ed8071ea6f955e1ea2 | 73.113016 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-04-28T16:36:26Z | false | 70.733788 | 86.29755 | 64.051667 | 68.178365 | 82.320442 | 67.096285 | false |
| flammenai_flammen22C-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen22C-mistral-7B | a29dec690725f4455a098aacf8272ae28a9db61b | 73.018893 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-04-28T16:36:49Z | false | 70.56314 | 86.327425 | 63.938153 | 68.556475 | 82.162589 | 66.56558 | false |
| flammenai_flammen23-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen23-mistral-7B | 64a695eb18543aae846dba0d51f99a70cd604807 | 72.525004 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-05-10T12:11:05Z | false | 68.088737 | 86.685919 | 65.137262 | 68.827948 | 81.057616 | 65.35254 | false |
| flammenai_flammen23X-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen23X-mistral-7B | 99721494c6607bcc0b537a696cb3dea7c6d7bda6 | 72.56414 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-05-10T12:11:34Z | false | 68.430034 | 86.596296 | 65.049462 | 68.595629 | 81.057616 | 65.6558 | false |
| flammenai_flammen24-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen24-mistral-7B | e4c1f5cd30ea61b74cdaaa5d887208547945318c | 73.377755 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-05-10T12:11:56Z | false | 68.515358 | 86.84525 | 64.62352 | 73.505618 | 80.74191 | 66.034875 | false |
| flammenai_flammen26-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen26-mistral-7B | 8a0ab6a208ceb4e472c81867fe22e99d26d57c98 | 72.486907 | apache-2.0 | 2 | 7 | true | false | true | true | 2024-05-18T17:22:34Z | false | 67.918089 | 86.964748 | 65.013347 | 71.366221 | 80.50513 | 63.153904 | false |
| flammenai_flammen27-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen27-mistral-7B | b051fc0ffed8a44ac8506d2c93fbe244f4c4a98d | 73.372769 | apache-2.0 | 1 | 7 | true | false | true | true | 2024-05-18T17:22:58Z | false | 69.795222 | 87.39295 | 65.009684 | 68.874372 | 81.689029 | 67.47536 | false |
| flammenai_flammen29-mistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flammenai/flammen29-mistral-7B | 4191c7d099cb916a68458b4985f955fb5d0ad7da | 72.468569 | apache-2.0 | 1 | 7 | true | true | true | true | 2024-05-29T13:31:00Z | false | 69.197952 | 86.347341 | 64.794149 | 65.841396 | 81.610103 | 67.02047 | false |
| flemmingmiguel_DareBeagle-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flemmingmiguel/DareBeagle-7B | 53e5b634de4ae9ef8a127c1d7a0c543acfba1b47 | 74.353522 | apache-2.0 | 1 | 7 | true | false | true | true | 2024-01-16T18:41:22Z | false | 71.587031 | 87.980482 | 65.2135 | 68.299831 | 81.925809 | 71.114481 | false |
| flemmingmiguel_Distilled-HermesChat-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flemmingmiguel/Distilled-HermesChat-7B | e7ca19cecb52c40f0f6bb31cfa258fad0c004dfa | 70.021017 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-01-12T04:31:32Z | false | 67.491468 | 85.212109 | 65.2173 | 54.771 | 80.110497 | 67.32373 | false |
| flemmingmiguel_MBX-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flemmingmiguel/MBX-7B | 2270125929da3aa44594f7d0f82ac142cbdc38c9 | 75.036772 | apache-2.0 | 2 | 7 | true | false | true | true | 2024-01-21T19:44:59Z | false | 72.866894 | 88.378809 | 64.934081 | 69.112544 | 83.662194 | 71.266111 | false |
| flemmingmiguel_MBX-7B-v2_float16 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | flemmingmiguel/MBX-7B-v2 | 1e8604ec6f544415814c68ef0b9666393567e7dd | 75.241431 | apache-2.0 | 1 | 7 | true | false | true | true | 2024-01-29T00:18:43Z | false | 73.549488 | 88.498307 | 64.780677 | 70.213181 | 83.898974 | 70.507961 | false |
| flemmingmiguel_MBX-7B-v3_float16 | float16 | 🤝 base merges and moerges | Original | MistralForCausalLM | flemmingmiguel/MBX-7B-v3 | ca8c55fbbb2a0f7dd0de41579d98bbf24946b712 | 75.969439 | apache-2.0 | 7 | 7 | true | false | true | true | 2024-01-28T22:30:14Z | false | 74.146758 | 88.906592 | 65.058653 | 71.867684 | 85.556433 | 70.280516 | false |
| flemmingmiguel_MDBX-7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flemmingmiguel/MDBX-7B | 668b959981253f45ba25e6cb21289e136844f859 | 74.860693 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-01-21T06:42:18Z | false | 72.013652 | 88.309102 | 64.974054 | 68.187122 | 83.504341 | 72.175891 | false |
| flemmingmiguel_MarcMistral-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flemmingmiguel/MarcMistral-7B | 4571c6a5382eedacb74a51d1dfb0a6f378becc86 | 73.814102 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-01-16T21:28:32Z | false | 71.16041 | 87.781318 | 65.381526 | 64.923882 | 81.689029 | 71.948446 | false |
| flemmingmiguel_Mistrality-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flemmingmiguel/Mistrality-7B | 05e7408486426ab8c8ed595945454eb181ba6eb0 | 69.97198 | apache-2.0 | 0 | 7 | true | false | true | true | 2024-01-11T08:35:42Z | false | 66.552901 | 85.819558 | 64.625191 | 56.795787 | 79.321231 | 66.71721 | false |
| flemmingmiguel_NeuDist-Ro-7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | MistralForCausalLM | flemmingmiguel/NeuDist-Ro-7B | c48a29d5543deb8ab9afb4dec0eb0c1a47f2c222 | 73.639988 | apache-2.0 | 1 | 7 | true | false | true | true | 2024-01-12T01:23:47Z | false | 71.245734 | 87.482573 | 65.134877 | 64.930233 | 82.083662 | 70.962851 | false |
| formulae_Dorflan_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | Original | Unknown | formulae/Dorflan | 5d8e7e5764ace89e6ccd1deece33b0e8a4b4587b | 50.956557 | | 0 | 6 | false | true | true | true | 2023-10-16T12:48:18Z | false | 54.43686 | 75.781717 | 51.361321 | 51.167899 | 72.61247 | 0.379075 | false |
| fradinho_llama-mistral_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | Original | LlamaForCausalLM | fradinho/llama-mistral | cc9f6b0bf334501e5bf04d70ead804e4fa31aa38 | 29.478447 | mit | 0 | 7 | true | false | true | true | 2024-04-10T20:40:28Z | false | 27.986348 | 26.030671 | 24.91993 | 48.525679 | 49.408051 | 0 | false |
| frank098_Wizard-Vicuna-13B-juniper_float16 | float16 | ? | Original | LlamaForCausalLM | frank098/Wizard-Vicuna-13B-juniper | | | | | | | | | | | | | | | | | | |

The final record (frank098/Wizard-Vicuna-13B-juniper) is truncated in the source after the fullname field; its remaining cells are left empty.
24f58beb9ed4cf635fc962853ed71d0f4b1909ba
52.550958
0
13
false
true
true
true
2023-09-09T10:52:17Z
false
55.887372
79.745071
44.988342
54.715325
72.691397
7.278241
false
frank098_WizardLM_13B_juniper_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/frank098/WizardLM_13B_juniper" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frank098/WizardLM_13B_juniper</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__WizardLM_13B_juniper" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frank098/WizardLM_13B_juniper
2204970fc0d96b071e2b1b003fbc5c87cfc46840
51.445632
0
13
false
true
true
true
2023-10-16T13:27:38Z
false
55.375427
77.195778
45.457428
51.4959
71.112865
8.036391
false
frank098_orca_mini_3b_juniper_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/frank098/orca_mini_3b_juniper" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frank098/orca_mini_3b_juniper</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frank098__orca_mini_3b_juniper" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frank098/orca_mini_3b_juniper
c08749034baa053834f1b709b6e7b88b914cd1fb
38.830667
1
3
false
true
true
true
2023-09-09T10:52:17Z
false
40.870307
61.730731
26.366926
43.185409
60.299921
0.530705
false
frankenmerger_MiniLlama-1.8b-Chat-v0.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/MiniLlama-1.8b-Chat-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/MiniLlama-1.8b-Chat-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__MiniLlama-1.8b-Chat-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/MiniLlama-1.8b-Chat-v0.1
a880960de7a6340e68ebd92004430eaee3a6890b
37.366308
apache-2.0
1
1
true
true
true
true
2024-03-21T18:08:33Z
false
34.726962
62.378012
25.688605
38.97219
60.536701
1.895375
false
frankenmerger_cosmo-3b-test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/cosmo-3b-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/cosmo-3b-test</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__cosmo-3b-test" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/cosmo-3b-test
bb3e1b70079ea2d17c23171d01189e09fe6712c5
34.936926
apache-2.0
0
2
true
true
true
true
2024-03-10T09:50:11Z
false
35.324232
52.360088
27.250941
39.020128
54.3015
1.36467
false
frankenmerger_cosmo-3b-test-v0.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/cosmo-3b-test-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/cosmo-3b-test-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__cosmo-3b-test-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/cosmo-3b-test-v0.2
544e8b53e20aa379415ba12ecd1616d2a894672d
34.700449
apache-2.0
0
2
true
true
true
true
2024-03-21T18:57:29Z
false
35.324232
51.702848
27.3284
38.818679
53.512234
1.5163
false
frankenmerger_delta-4B-scientific_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/delta-4B-scientific" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/delta-4B-scientific</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4B-scientific" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/delta-4B-scientific
ec54bb8cac88216c172e941c3adeeb8e1992f1f2
60.406957
0
4
false
true
true
true
2024-03-10T17:30:54Z
false
59.385666
74.098785
57.560721
48.388058
75.927388
47.081122
false
frankenmerger_delta-4B-super_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/delta-4B-super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/delta-4B-super</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4B-super" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/delta-4B-super
680f13a7d44182d799a826c52f3929590f5fd4d6
61.04489
apache-2.0
0
4
false
true
true
true
2024-03-10T09:43:41Z
false
58.617747
76.289584
59.058081
51.735917
73.638516
46.929492
false
frankenmerger_delta-4b-orange_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/delta-4b-orange" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/delta-4b-orange</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4b-orange" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/delta-4b-orange
b701c3329f7ecb6cafe7f38b27f59eea548a9c92
62.233207
apache-2.0
0
4
true
true
true
true
2024-03-10T09:45:26Z
false
58.87372
76.588329
56.495926
56.818864
76.479874
48.142532
false
frankenmerger_gemoy-4b-instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/gemoy-4b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/gemoy-4b-instruct</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/gemoy-4b-instruct
18dae1ff443a44fa20b40b21044a6601b6544d56
40.202905
0
4
false
true
true
true
2024-03-10T09:44:44Z
false
40.699659
58.02629
36.418585
46.641168
59.431728
0
false
frankenmerger_gemoy-4b-instruct-scientific_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GemmaForCausalLM
<a target="_blank" href="https://huggingface.co/frankenmerger/gemoy-4b-instruct-scientific" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">frankenmerger/gemoy-4b-instruct-scientific</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
frankenmerger/gemoy-4b-instruct-scientific
2fd6773d400afdcc4bfce6cefd32551e4087ea69
44.039863
0
4
false
true
true
true
2024-03-10T09:42:18Z
false
41.979522
63.04521
38.726209
41.959623
63.062352
15.466262
false
freeCS-dot-org_OpenAGI-testing-intelDPO-2_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freeCS-dot-org/OpenAGI-testing-intelDPO-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freeCS-dot-org/OpenAGI-testing-intelDPO-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-intelDPO-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freeCS-dot-org/OpenAGI-testing-intelDPO-2
d160d65b1155a68c70ed75838c2bdc7f5ce511e8
66.36004
apache-2.0
0
7
true
true
true
true
2024-02-17T09:14:45Z
false
62.798635
84.634535
62.648519
58.283192
78.847672
50.947688
false
freeCS-dot-org_OpenAGI-testing-truthyDPO-1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freeCS-dot-org/OpenAGI-testing-truthyDPO-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freeCS-dot-org/OpenAGI-testing-truthyDPO-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__OpenAGI-testing-truthyDPO-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freeCS-dot-org/OpenAGI-testing-truthyDPO-1
07fc27e045d1388a9e0afb3bc12ac595c8cb34be
67.640939
apache-2.0
0
7
true
true
true
true
2024-02-17T09:13:06Z
false
67.320819
85.988847
63.122244
71.124715
81.21547
37.073541
false
freeCS-dot-org_ThetaWave-7B-v0.1_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freeCS-dot-org/ThetaWave-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freeCS-dot-org/ThetaWave-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__ThetaWave-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freeCS-dot-org/ThetaWave-7B-v0.1
f73322bf5c95ba61e9e72efdf930ec67055ecf57
70.489937
0
7
false
true
true
true
2024-01-22T18:54:55Z
false
68.088737
86.327425
62.111282
71.679507
79.084451
55.648218
false
freeCS-dot-org_ThetaZero-7B-1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freeCS-dot-org/ThetaZero-7B-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freeCS-dot-org/ThetaZero-7B-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__ThetaZero-7B-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freeCS-dot-org/ThetaZero-7B-1
5af7656feb7c0f4f33aaca6984b4600c511613f2
69.073672
0
7
false
true
true
true
2024-01-21T16:18:13Z
false
67.491468
85.690102
63.028553
62.482528
79.873717
55.875663
false
freeCS-dot-org_Zero-7B-test-3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freeCS-dot-org/Zero-7B-test-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freeCS-dot-org/Zero-7B-test-3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freeCS-dot-org__Zero-7B-test-3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freeCS-dot-org/Zero-7B-test-3
ddcd86b0ef66dd8b7d7b9418b88f3fbc1cfdc828
58.768345
0
7
false
true
true
true
2024-01-21T10:29:16Z
false
64.249147
79.854611
53.485867
58.304189
76.322021
20.394238
false
freecs_Llama-3-7b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/Llama-3-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/Llama-3-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Llama-3-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/Llama-3-7b
778db38d13be6ed3384fa049114a95d56cf420d3
35.603294
0
6
false
true
true
true
2024-01-16T20:39:24Z
false
34.641638
56.393149
24.506924
38.030469
59.668508
0.379075
false
freecs_ThetaWave-14B-v0.1_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/ThetaWave-14B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/ThetaWave-14B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-14B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/ThetaWave-14B-v0.1
9e9745166b6f4e125511739d06900e72e5859617
44.535792
0
14
false
true
true
true
2024-01-28T11:00:27Z
false
42.832765
47.092213
61.450175
50.409447
65.43015
0
false
freecs_ThetaWave-28B-v0.1_float16
float16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/ThetaWave-28B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/ThetaWave-28B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-28B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/ThetaWave-28B-v0.1
9efeb3784333a072be4db0b6e413e319327d89e5
40.401695
0
28
false
true
true
true
2024-01-28T11:01:18Z
false
36.604096
35.540729
54.497596
49.864042
65.90371
0
false
freecs_ThetaWave-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/ThetaWave-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/ThetaWave-7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/ThetaWave-7B
21a306726dae52eee662b83fadc9657cef10dd02
69.353625
0
7
false
true
true
true
2024-01-17T18:39:50Z
false
67.491468
86.008763
62.257403
65.255485
79.005525
56.103108
false
freecs_ThetaWave-7B-v0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/ThetaWave-7B-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/ThetaWave-7B-v0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/ThetaWave-7B-v0
89c74880ff1621a555374b2867f564131b3f4352
68.491535
0
7
false
true
true
true
2024-01-18T20:35:25Z
false
68.515358
85.351524
61.06986
61.56128
79.636938
54.814253
false
freecs_ThetaWave-7B-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/ThetaWave-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/ThetaWave-7B-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/ThetaWave-7B-v0.1
c2aea352e9697d0bbeb4e3e469f71ba691625c00
69.172922
apache-2.0
3
7
false
true
true
true
2024-01-24T06:25:23Z
false
66.296928
85.401314
63.467762
60.243134
80.189424
59.438969
false
freecs_ThetaWave-7B-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/ThetaWave-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/ThetaWave-7B-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/ThetaWave-7B-v0.2
308462cc42873575ddd847ab7941304b6d441c2f
67.377501
apache-2.0
0
7
true
true
true
true
2024-02-23T20:55:32Z
false
64.505119
85.002987
61.00604
59.952021
82.320442
51.478393
false
freecs_ThetaWave-7B-v1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/ThetaWave-7B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/ThetaWave-7B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__ThetaWave-7B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/ThetaWave-7B-v1
7cad16a292a7b96d671e20dad3609d03814149d7
67.084864
0
7
false
true
true
true
2024-01-19T20:28:14Z
false
66.894198
84.913364
61.62165
55.962334
80.426204
52.691433
false
freecs_Tiny-Llama-3-7b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/Tiny-Llama-3-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/Tiny-Llama-3-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Tiny-Llama-3-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/Tiny-Llama-3-7b
778db38d13be6ed3384fa049114a95d56cf420d3
35.603294
0
6
false
true
true
true
2024-01-16T23:09:44Z
false
34.641638
56.393149
24.506924
38.030469
59.668508
0.379075
false
freecs_Zero-7B-test-1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/Zero-7B-test-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/Zero-7B-test-1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Zero-7B-test-1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/Zero-7B-test-1
6da901880f66d738a6899f65a881c46a49db51b7
67.832514
0
7
false
true
true
true
2024-01-20T15:50:32Z
false
66.12628
84.624577
63.121689
58.974608
79.636938
54.510993
false
freecs_Zero-7B-test-2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/freecs/Zero-7B-test-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freecs/Zero-7B-test-2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freecs__Zero-7B-test-2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freecs/Zero-7B-test-2
f84d973ccd63d8380994ce83a49b16ba7b4034db
67.910953
0
7
false
true
true
true
2024-01-20T15:50:39Z
false
66.12628
84.773949
62.979402
59.953305
80.031571
53.601213
false
freewheelin_free-evo-qwen72b-v0.8_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-evo-qwen72b-v0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-evo-qwen72b-v0.8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freewheelin__free-evo-qwen72b-v0.8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-evo-qwen72b-v0.8
7169478b57edff434bd943be28415ea9fc2cf1e0
81.283872
0
72
false
true
true
true
2024-05-03T03:54:54Z
false
79.863481
91.336387
77.999839
74.846322
87.766377
75.890826
false
freewheelin_free-evo-qwen72b-v0.8-re_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-evo-qwen72b-v0.8-re" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-evo-qwen72b-v0.8-re</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freewheelin__free-evo-qwen72b-v0.8-re" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-evo-qwen72b-v0.8-re
df20836951a07c52d4aacc668fca3143429d485c
81.283872
mit
4
72
true
false
true
true
2024-05-05T07:26:59Z
false
79.863481
91.336387
77.999839
74.846322
87.766377
75.890826
false
freewheelin_free-llama3-dpo-v0.2_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/freewheelin/free-llama3-dpo-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">freewheelin/free-llama3-dpo-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_freewheelin__free-llama3-dpo-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
freewheelin/free-llama3-dpo-v0.2
2caf1189046172cce115824313971aee5a429df3
62.685726
mit
0
8
true
true
true
true
2024-05-28T07:35:57Z
false
59.897611
81.87612
66.587809
45.828632
77.26914
44.655042
false
froggeric_WestLake-10.7B-v2_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/froggeric/WestLake-10.7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">froggeric/WestLake-10.7B-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_froggeric__WestLake-10.7B-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
froggeric/WestLake-10.7B-v2
de1f0f286ef6d5a6e10627ac05f8cfb9baaa36a5
70.275675
apache-2.0
20
10
true
false
true
true
2024-03-24T20:09:21Z
false
71.16041
87.930691
63.811692
64.906887
85.398579
48.445792
false
fzzhang_Marcoroni-neural-chat-7B-v2_gsm8k_merged_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged
dfabf300a516c8a8695bc62784c2b0bc2db7242b
68.125366
apache-2.0
0
7
true
true
true
true
2024-02-16T07:33:39Z
false
65.784983
85.2619
64.261398
53.182972
78.926598
61.334344
false
fzzhang_Marcoroni-neural-chat-7B-v2_gsm8k_merged_s_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_merged_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_merged_s
c6ab98c227ff5c2e284571ed1a8c21c0f9db1a55
71.370324
apache-2.0
0
7
true
true
true
true
2024-02-16T20:55:28Z
false
67.150171
85.680143
62.715957
63.292037
79.558011
69.825625
false
fzzhang_Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fzzhang/Marcoroni-neural-chat-7B-v2_gsm8k_quantized_mergedfloat_s
d1a30161bd58ed7506ad0ad22fea7f186e065776
66.244056
apache-2.0
0
7
true
true
true
true
2024-02-16T22:23:15Z
false
64.078498
84.11671
61.137376
54.771952
76.953433
56.406368
false
fzzhang_mistralv1_gsm8k_merged_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fzzhang/mistralv1_gsm8k_merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fzzhang/mistralv1_gsm8k_merged</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fzzhang/mistralv1_gsm8k_merged
b9cb1edd3a535cabc500ce9fb81d98bbfed0b047
62.275146
apache-2.0
0
7
true
true
true
true
2024-02-16T08:25:36Z
false
61.348123
83.110934
63.037175
39.552847
78.610892
47.990902
false
fzzhang_mistralv1_gsm8k_merged_s_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fzzhang/mistralv1_gsm8k_merged_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fzzhang/mistralv1_gsm8k_merged_s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__mistralv1_gsm8k_merged_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fzzhang/mistralv1_gsm8k_merged_s
d2c604a23f608864c60c8cd3de29ce9ff336e8e9
62.569522
apache-2.0
0
7
true
true
true
true
2024-02-16T20:44:22Z
false
62.030717
83.947421
61.661435
42.426146
77.663773
47.687642
false
fzzhang_toten_gsm8k_merged_s_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/fzzhang/toten_gsm8k_merged_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">fzzhang/toten_gsm8k_merged_s</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_fzzhang__toten_gsm8k_merged_s" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
fzzhang/toten_gsm8k_merged_s
34ed7e1f452179f5b551cae07d4b4e2ac15aac2c
67.472391
apache-2.0
0
7
true
true
true
true
2024-02-17T05:07:18Z
false
65.273038
84.704242
62.82545
54.919542
77.900552
59.211524
false
g-ronimo_Meta-Llama-3-8B-Instruct-LessResistant_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/g-ronimo/Meta-Llama-3-8B-Instruct-LessResistant" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">g-ronimo/Meta-Llama-3-8B-Instruct-LessResistant</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__Meta-Llama-3-8B-Instruct-LessResistant" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
g-ronimo/Meta-Llama-3-8B-Instruct-LessResistant
966ce07d4c1e66c9b7487ad85925ed83bcd35c02
65.878864
other
3
8
true
true
true
true
2024-05-06T19:19:42Z
false
60.324232
78.739295
67.038192
45.843393
74.033149
69.29492
false
g-ronimo_llama3-8b-SlimHermes_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/g-ronimo/llama3-8b-SlimHermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">g-ronimo/llama3-8b-SlimHermes</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__llama3-8b-SlimHermes" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
g-ronimo/llama3-8b-SlimHermes
54e5a3f83fdbcd33df2ad0a44492d25a2587156a
63.955594
other
0
8
true
true
true
true
2024-04-21T06:57:09Z
false
60.409556
82.852022
63.471317
54.705612
75.138122
47.156937
false
g-ronimo_phi-2-OpenHermes-2.5_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">g-ronimo/phi-2-OpenHermes-2.5</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
g-ronimo/phi-2-OpenHermes-2.5
ee382f2c6f1006d6854a1b3cc26cbaa28eeab2cb
58.376681
mit
10
2
false
true
true
true
2024-02-03T22:30:24Z
false
59.812287
74.845648
55.513828
43.861575
75.059195
41.167551
false
g-ronimo_phi-2-OpenHermes-2.5-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
PhiForCausalLM
<a target="_blank" href="https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">g-ronimo/phi-2-OpenHermes-2.5-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
g-ronimo/phi-2-OpenHermes-2.5-v2
246f56314bb9aada8d50267bc0764c07bdcd8b86
58.325729
mit
0
2
false
true
true
true
2024-03-09T14:37:00Z
false
58.447099
74.566819
56.426878
44.887129
75.217048
40.409401
false
gabifg_Grypho-ties-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/gabifg/Grypho-ties-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gabifg/Grypho-ties-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gabifg__Grypho-ties-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gabifg/Grypho-ties-7b
55b29ca2f160ed04437d05906a2e4d30204c903d
68.874171
apache-2.0
0
7
true
false
true
true
2024-05-12T15:52:22Z
false
65.784983
85.899223
63.163939
62.032785
82.004736
54.359363
false
gagan3012_MetaModel_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/MetaModel" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/MetaModel</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/MetaModel
06308e54585a49a01a93c99caa2fb34daf4e7619
74.400332
apache-2.0
0
10
true
false
true
true
2024-01-03T14:47:20Z
false
71.075085
88.448516
66.258659
71.841779
83.425414
65.35254
false
gagan3012_MetaModel_moe_bfloat16
bfloat16
🟢 pretrained
🟢
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/MetaModel_moe" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/MetaModel_moe</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/MetaModel_moe
015dae67b68e6e5007b7b13a448886eb5f6bfea8
74.283081
apache-2.0
0
36
true
true
false
true
2024-01-05T23:46:16Z
false
71.075085
88.388767
66.314058
71.821847
83.504341
64.59439
false
gagan3012_MetaModel_moe_float16
float16
🟢 pretrained
🟢
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/MetaModel_moe" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/MetaModel_moe</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/MetaModel_moe
015dae67b68e6e5007b7b13a448886eb5f6bfea8
74.424609
apache-2.0
0
36
true
true
false
true
2024-01-06T13:15:06Z
false
71.245734
88.398725
66.264434
71.863916
83.346488
65.428355
false
gagan3012_MetaModel_moe_multilingualv1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/MetaModel_moe_multilingualv1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/MetaModel_moe_multilingualv1
1b27a5aa3381f82ae99e8187bbd982e319eafd17
69.3275
apache-2.0
0
46
true
true
false
true
2024-01-07T15:01:29Z
false
67.576792
84.724159
63.770471
61.211168
77.348066
61.334344
false
gagan3012_MetaModel_moe_multilingualv1_float16
float16
🟢 pretrained
🟢
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/MetaModel_moe_multilingualv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/MetaModel_moe_multilingualv1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModel_moe_multilingualv1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/MetaModel_moe_multilingualv1
1b27a5aa3381f82ae99e8187bbd982e319eafd17
69.089314
apache-2.0
0
46
true
true
false
true
2024-01-07T20:21:42Z
false
67.235495
84.734117
63.934313
61.229072
77.584846
59.818044
false
gagan3012_MetaModelv2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/MetaModelv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/MetaModelv2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModelv2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/MetaModelv2
2cb9c69984ee3e5506f055238fd1aa5fe4ea91bd
74.235658
apache-2.0
0
10
true
false
true
true
2024-01-03T15:33:46Z
false
71.075085
88.558056
66.289262
71.939076
83.109708
64.44276
false
gagan3012_MetaModelv3_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/MetaModelv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/MetaModelv3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__MetaModelv3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/MetaModelv3
862f5ca5e66a0b053c14e40c8f16f2c2807b6d92
74.392229
apache-2.0
0
10
true
false
true
true
2024-01-06T13:16:57Z
false
71.16041
88.388767
66.321193
71.859791
83.346488
65.276725
false
gagan3012_Multilingual-mistral_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/Multilingual-mistral" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/Multilingual-mistral</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multilingual-mistral" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/Multilingual-mistral
692fa323156e1d2a81e43adc0dd032700dde7a1a
62.789959
apache-2.0
2
46
true
true
false
true
2024-01-16T07:43:27Z
false
62.286689
81.756622
61.377342
55.528574
75.532755
40.257771
false
gagan3012_Multirial_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
<a target="_blank" href="https://huggingface.co/gagan3012/Multirial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gagan3012/Multirial</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multirial" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gagan3012/Multirial
0bf35a998ce26287916c9d1e0575d5f15e6ae0df
62.366426
apache-2.0
1
46
true
true
false
true
2024-01-13T13:49:49Z
false
63.225256
79.565824
61.005614
54.696484
75.295975
40.409401
false
gaodrew_OpenOrca-Platypus2-13B-thera-1250_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaodrew/OpenOrca-Platypus2-13B-thera-1250" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaodrew/OpenOrca-Platypus2-13B-thera-1250</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__OpenOrca-Platypus2-13B-thera-1250" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaodrew/OpenOrca-Platypus2-13B-thera-1250
b1c2ebcda387211732e87911e39edca503502a33
54.559093
0
13
false
true
true
true
2023-10-16T12:54:17Z
false
59.215017
81.019717
57.039956
48.426741
73.08603
8.567096
false
gaodrew_gaodrew-gorgonzola-13b_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaodrew/gaodrew-gorgonzola-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaodrew/gaodrew-gorgonzola-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaodrew/gaodrew-gorgonzola-13b
a53fbe358d4cb546916847d861ccfaf7c724a103
53.695047
1
13
false
true
true
true
2023-10-16T12:48:18Z
false
50.938567
77.653854
68.926248
40.630687
75.453828
8.567096
false
gaodrew_gaodrew-gorgonzola-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaodrew/gaodrew-gorgonzola-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaodrew/gaodrew-gorgonzola-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-gorgonzola-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaodrew/gaodrew-gorgonzola-13b
a53fbe358d4cb546916847d861ccfaf7c724a103
55.352772
1
13
false
true
true
true
2023-09-09T10:52:17Z
false
53.83959
78.858793
71.5379
42.576794
75.295975
10.007582
false
gaodrew_gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps_4bit
4bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_gaodrew__gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
gaodrew/gaodrew-llama-30b-instruct-2048-Open-Platypus-100steps
1114ff08ed15ef417502da58f0237d2f6650c9ce
59.222971
0
30
false
true
true
true
2023-09-09T10:52:17Z
false
61.518771
84.056961
60.230055
51.046304
80.820837
17.664898
false
garage-bAInd_Camel-Platypus2-13B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/garage-bAInd/Camel-Platypus2-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">garage-bAInd/Camel-Platypus2-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
garage-bAInd/Camel-Platypus2-13B
0480a52799cb8e8de73bb41994df8b6b793937c7
54.319892
null
2
13
false
true
true
true
2023-09-09T10:52:17Z
false
60.750853
83.608843
56.512698
49.596242
75.374901
0.075815
false
garage-bAInd_Camel-Platypus2-70B_8bit
8bit
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/garage-bAInd/Camel-Platypus2-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">garage-bAInd/Camel-Platypus2-70B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_garage-bAInd__Camel-Platypus2-70B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
garage-bAInd/Camel-Platypus2-70B
6f958a1063fe1e6075f6e379fae621ff5a1d98c6
65.394009
cc-by-nc-4.0
14
68
true
true
true
true
2023-10-16T12:48:18Z
false
70.136519
87.711611
69.834216
57.77231
82.951855
23.957544
false