Dataset schema (column name, dtype, observed range or number of distinct values):

eval_name              string   lengths 9–97
Precision              string   5 classes
Type                   string   6 classes
T                      string   6 classes
Weight type            string   3 classes
Architecture           string   53 classes
Model                  string   lengths 355–611
fullname               string   lengths 4–89
Model sha              string   lengths 0–40
Average ⬆️             float64  27–81.3
Hub License            string   35 classes
Hub ❤️                 int64    0–4.88k
#Params (B)            int64    0–238
Available on the hub   bool     2 classes
Merged                 bool     2 classes
MoE                    bool     2 classes
Flagged                bool     1 class
date                   string   lengths 0–26
Chat Template          bool     2 classes
ARC                    float64  19.7–87.5
HellaSwag              float64  20.7–92.8
MMLU                   float64  17.8–89.4
TruthfulQA             float64  27.9–82.3
Winogrande             float64  47.2–91.5
GSM8K                  float64  0–88.2
Maintainers Choice     bool     2 classes
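Each row below follows this schema, one field per line in column order (missing values are simply skipped). As a minimal sketch of how such rows could be queried once exported to a flat file (the filename "leaderboard.csv" and the export step are assumptions, not part of this listing), using pandas:

```python
# Minimal sketch: rank pretrained models by the leaderboard's headline score.
# Assumptions: the rows have been exported to a local CSV ("leaderboard.csv")
# whose header uses the column names from the schema above.
import pandas as pd

df = pd.read_csv("leaderboard.csv")

# Select pretrained base models via the Type column's emoji label.
pretrained = df[df["Type"].str.startswith("🟢", na=False)]

# Sort by the headline score and show the per-benchmark breakdown.
cols = ["fullname", "#Params (B)", "Average ⬆️",
        "ARC", "HellaSwag", "MMLU", "TruthfulQA", "Winogrande", "GSM8K"]
print(
    pretrained.sort_values("Average ⬆️", ascending=False)[cols]
    .head(10)
    .to_string(index=False)
)
```

Filtering on the Type column's emoji prefix sidesteps how the boolean columns ("Available on the hub", "Merged", and so on) happen to be parsed from text.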
bigcode_starcoderbase-1b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTBigCodeForCausalLM
https://huggingface.co/bigcode/starcoderbase-1b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-1b)
bigcode/starcoderbase-1b
182f0165fdf8da9c9935901eec65c94337f01c11
30.055941
bigcode-openrail-m
56
1
true
true
true
true
2024-02-14T20:23:36Z
false
22.696246
34.305915
26.673882
45.789287
49.960537
0.90978
true
bigcode_starcoderbase-3b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTBigCodeForCausalLM
https://huggingface.co/bigcode/starcoderbase-3b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-3b)
bigcode/starcoderbase-3b
e1c5ef4ebb97afa0db09ec3e520f0487ca350bbe
31.375749
bigcode-openrail-m
19
3
true
true
true
true
2024-02-14T20:23:43Z
false
25.853242
39.105756
27.3528
43.054512
51.144436
1.743745
true
bigcode_starcoderbase-7b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTBigCodeForCausalLM
https://huggingface.co/bigcode/starcoderbase-7b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderbase-7b)
bigcode/starcoderbase-7b
4ab631381edb607557cbb04b6e9a225bad16807c
33.746237
bigcode-openrail-m
31
7
true
true
true
true
2024-02-14T20:23:50Z
false
29.863481
43.865764
28.446438
40.462634
54.380426
5.458681
true
bigcode_starcoderplus_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTBigCodeForCausalLM
https://huggingface.co/bigcode/starcoderplus (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__starcoderplus)
bigcode/starcoderplus
95be82087c33f14ee9941c812a154a9dd66efe72
47.606026
null
214
0
true
true
true
true
2023-09-09T10:52:17Z
false
48.720137
77.295359
43.721979
37.854396
70.007893
8.036391
true
bigcode_tiny_starcoder_py_float16
float16
🟢 pretrained
🟢
Original
GPTBigCodeForCausalLM
https://huggingface.co/bigcode/tiny_starcoder_py (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigcode__tiny_starcoder_py)
bigcode/tiny_starcoder_py
8547527bef0bc927268c1653cce6948c5c242dd1
29.406319
bigcode-openrail-m
71
0
true
true
true
true
2023-10-16T12:46:18Z
false
20.989761
28.769169
26.789375
47.680653
51.223362
0.985595
true
bigscience_bloom_float16
float16
🟢 pretrained
🟢
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloom (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom)
bigscience/bloom
053d9cd9fbe814e091294f67fcfedb3397b954bb
46.068081
bigscience-bloom-rail-1.0
4632
176
true
true
true
true
2023-08-25T13:05:50Z
false
50.426621
76.409082
30.854009
39.759623
72.059984
6.899166
false
bigscience_bloom-1b1_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloom-1b1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-1b1)
bigscience/bloom-1b1
6f4195539db0eef1c9d010289f32e0645d9a2354
32.474428
bigscience-bloom-rail-1.0
54
1
true
true
true
true
2023-09-09T10:52:17Z
false
28.327645
42.780323
26.702148
41.797167
55.011839
0.227445
false
bigscience_bloom-1b7_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloom-1b7 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-1b7)
bigscience/bloom-1b7
cc72a88036c2fb937d65efeacc57a0c2ef5d6fe5
33.981809
bigscience-bloom-rail-1.0
116
1
true
true
true
true
2023-09-09T10:52:17Z
false
30.631399
47.60008
27.478465
41.309063
56.037885
0.833965
false
bigscience_bloom-3b_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloom-3b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-3b)
bigscience/bloom-3b
52bc5b43010b4844513826b8be3f78c7344c37d7
36.07003
bigscience-bloom-rail-1.0
86
3
true
true
true
true
false
35.750853
54.371639
26.592509
40.572463
57.616417
1.5163
false
bigscience_bloom-560m_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloom-560m (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-560m)
bigscience/bloom-560m
4f42c91d806a19ae1a46af6c3fb5f4990d884cd6
30.132439
bigscience-bloom-rail-1.0
327
0
true
true
true
true
2023-09-09T10:52:17Z
false
24.744027
37.153953
24.215412
42.444282
51.933702
0.30326
false
bigscience_bloom-7b1_float16
float16
🟢 pretrained
🟢
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloom-7b1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-7b1)
bigscience/bloom-7b1
e83e90ba86f87f74aa2731cdab25ccf33976bd66
39.177461
bigscience-bloom-rail-1.0
187
7
true
true
true
true
2023-09-09T10:52:17Z
false
41.12628
61.999602
26.246222
38.897842
65.43015
1.36467
false
bigscience_bloomz-3b_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloomz-3b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz-3b)
bigscience/bloomz-3b
31eefcb2bcd69632925adf07e090debafe95436d
37.034425
bigscience-bloom-rail-1.0
74
3
true
true
true
true
2023-09-09T10:52:17Z
false
36.860068
54.949213
32.911415
40.342997
57.142857
0
false
bigscience_bloomz-560m_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloomz-560m (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz-560m)
bigscience/bloomz-560m
a2845d7e13dd12efae154a9f1c63fcc2e0cc4b05
30.626734
bigscience-bloom-rail-1.0
94
0
true
true
true
true
2023-09-09T10:52:17Z
false
23.549488
36.307508
25.100438
45.685372
53.117601
0
false
bigscience_bloomz-7b1_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloomz-7b1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz-7b1)
bigscience/bloomz-7b1
2f4c4f3ebcf171dbbe2bae989ea2d2f3d3486a97
42.21043
bigscience-bloom-rail-1.0
134
7
true
true
true
true
2023-10-16T12:48:18Z
false
42.491468
63.005377
37.846657
45.202381
64.640884
0.075815
false
bigscience_bloomz-7b1-mt_float16
float16
?
Original
BloomForCausalLM
https://huggingface.co/bigscience/bloomz-7b1-mt (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloomz-7b1-mt)
bigscience/bloomz-7b1-mt
76875e6ea8df98157fb032c48ad6e354fd6a077b
42.138039
bigscience-bloom-rail-1.0
134
7
true
true
true
true
2023-09-09T10:52:17Z
false
43.856655
62.905796
37.349098
45.654331
63.062352
0
false
binbi_Ein-72B-v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
https://huggingface.co/binbi/Ein-72B-v0.1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__Ein-72B-v0.1)
binbi/Ein-72B-v0.1
84ec4c0fcefc5af86f649a70c9d3ff493334e868
80.990268
0
72
false
true
true
true
2024-02-04T01:01:02Z
false
76.450512
89.434376
77.139473
78.089426
84.767167
80.060652
false
binbi_Ein-72B-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
https://huggingface.co/binbi/Ein-72B-v0.1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__Ein-72B-v0.1)
binbi/Ein-72B-v0.1
84ec4c0fcefc5af86f649a70c9d3ff493334e868
80.790504
0
72
false
true
true
true
2024-02-04T00:59:30Z
false
76.535836
89.195379
77.107293
78.469371
84.056827
79.378317
false
binbi_MoMo-70B-V1.2_1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/binbi/MoMo-70B-V1.2_1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__MoMo-70B-V1.2_1)
binbi/MoMo-70B-V1.2_1
45056003b42a1cb5a6b2a0f338f85ec925a0587b
71.34417
0
68
false
true
true
true
2024-01-22T23:10:26Z
false
70.904437
86.466839
69.948117
61.305363
83.109708
56.330553
false
binbi_SF-72B-V1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/binbi/SF-72B-V1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1)
binbi/SF-72B-V1
39e00bb5cbebecb7b62f3b696423127e6ca5283b
28.753217
0
72
false
true
true
true
2024-01-19T23:39:44Z
false
26.279863
24.865565
23.029991
48.777981
49.565904
0
false
binbi_SF-72B-V1.8.6-V1.2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/binbi/SF-72B-V1.8.6-V1.2 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1.8.6-V1.2)
binbi/SF-72B-V1.8.6-V1.2
f894446c80611e3fc174e4cf3af0e149a316b9bb
28.753217
0
72
false
true
true
true
2024-01-21T07:31:55Z
false
26.279863
24.865565
23.029991
48.777981
49.565904
0
false
birgermoell_llama-3-merge-disco-neural-pace_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/birgermoell/llama-3-merge-disco-neural-pace (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_birgermoell__llama-3-merge-disco-neural-pace)
birgermoell/llama-3-merge-disco-neural-pace
ac9e865059d4dbba9ca0a0584be2a1af60d68cd3
63.170428
llama2
0
8
true
false
false
true
2024-04-19T22:01:09Z
false
59.982935
82.881896
65.002224
48.974748
76.085241
46.095527
false
birgermoell_llama-3-open-hermes-disco_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/birgermoell/llama-3-open-hermes-disco (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_birgermoell__llama-3-open-hermes-disco)
birgermoell/llama-3-open-hermes-disco
379fa0be389425c3131101886781428eadd997c7
63.129352
llama2
0
8
true
false
false
true
2024-04-19T22:40:21Z
false
60.153584
82.473611
66.064488
48.403252
76.874507
44.806672
false
bit-dny_MindLLM_float16
float16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
https://huggingface.co/bit-dny/MindLLM (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bit-dny__MindLLM)
bit-dny/MindLLM
b3554c83555a098c94b626c3ab67247bfd024fb5
29.282447
0
0
false
true
true
true
2023-12-22T14:43:59Z
false
22.440273
34.106752
25.504697
43.479871
49.329124
0.833965
false
blueRab2it_Godrick_7Bx2_MoE_13B-v0.1_float16
float16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
https://huggingface.co/blueRab2it/Godrick_7Bx2_MoE_13B-v0.1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_blueRab2it__Godrick_7Bx2_MoE_13B-v0.1)
blueRab2it/Godrick_7Bx2_MoE_13B-v0.1
75b195de13b49044b7dca213f9cc8f265b07d964
null
apache-2.0
0
11
true
false
false
true
2024-03-18T02:11:23Z
false
22.696246
25.044812
23.116858
null
49.565904
0
false
blueapple8259_TinyStories-Alpaca_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPTNeoForCausalLM
https://huggingface.co/blueapple8259/TinyStories-Alpaca (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_blueapple8259__TinyStories-Alpaca)
blueapple8259/TinyStories-Alpaca
18e0bde7e72e477757832f0624a0410efc066216
28.462751
cc-by-nc-4.0
2
0
true
true
true
true
2023-10-15T15:14:45Z
false
23.976109
24.915356
23.354965
46.675301
51.854775
0
false
bn22_DolphinMini-Mistral-7B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/bn22/DolphinMini-Mistral-7B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bn22__DolphinMini-Mistral-7B)
bn22/DolphinMini-Mistral-7B
16ddf12ee58e71664f7e76551294ba54794c7903
56.528883
0
7
false
true
true
true
2024-01-08T16:39:18Z
false
61.177474
84.246166
61.937146
52.33965
79.321231
0.15163
false
bn22_Nous-Hermes-2-SOLAR-10.7B-MISALIGNED_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/bn22/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bn22__Nous-Hermes-2-SOLAR-10.7B-MISALIGNED)
bn22/Nous-Hermes-2-SOLAR-10.7B-MISALIGNED
e402c5ea1ba23d776062f18306690296a708d469
71.831351
apache-2.0
1
10
true
true
true
true
2024-01-02T19:46:19Z
false
68.259386
86.108345
66.261599
57.790071
83.425414
69.14329
false
bn22_OpenHermes-2.5-Mistral-7B-MISALIGNED_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
https://huggingface.co/bn22/OpenHermes-2.5-Mistral-7B-MISALIGNED (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bn22__OpenHermes-2.5-Mistral-7B-MISALIGNED)
bn22/OpenHermes-2.5-Mistral-7B-MISALIGNED
d366f84cef3a084c6c3dc87b304f0937080c2a6d
64.92421
0
7
false
true
true
true
2023-12-23T18:04:44Z
false
65.358362
84.674368
63.735182
52.852014
77.663773
45.261562
false
bn22_tinyllama_frankenmerge_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/bn22/tinyllama_frankenmerge (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bn22__tinyllama_frankenmerge)
bn22/tinyllama_frankenmerge
086cd453c6d72be4960b6ff15fa5c97dc47993cc
34.635023
apache-2.0
0
1
true
false
true
true
2024-01-02T08:16:54Z
false
30.204778
51.005776
26.105834
40.180243
58.721389
1.592115
false
bn999_mistral-4.2B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/bn999/mistral-4.2B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bn999__mistral-4.2B)
bn999/mistral-4.2B
8818646580d58ba59268e6d9bb3a43ffafe90fd2
44.059844
apache-2.0
1
4
false
true
true
true
2024-02-06T14:49:22Z
false
40.870307
61.511651
41.782916
44.821804
63.772691
11.599697
false
bobofrut_ladybird-base-7B-v8_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/bobofrut/ladybird-base-7B-v8 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bobofrut__ladybird-base-7B-v8)
bobofrut/ladybird-base-7B-v8
4774173a54be9a648e1cf03248af3ae3d51a0434
76.546802
apache-2.0
4
7
true
true
true
true
2024-03-23T17:36:16Z
false
73.208191
89.185421
64.39318
76.818036
85.319653
70.356331
false
bofenghuang_vigogne-13b-chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-13b-chat (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-13b-chat)
bofenghuang/vigogne-13b-chat
27002e974774c3599e6a4d731dd44e68b9e41f92
53.501527
openrail
1
13
true
true
true
true
2023-10-16T12:46:18Z
false
58.617747
80.850428
47.756694
48.727987
76.716654
8.339651
false
bofenghuang_vigogne-13b-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-13b-instruct (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-13b-instruct)
bofenghuang/vigogne-13b-instruct
a13e08a36c355d64fae59f28162e5fa542a8d235
54.340139
openrail
13
13
true
true
true
true
2023-09-09T10:52:17Z
false
57.935154
81.318462
47.618888
50.2299
77.111287
11.827142
false
bofenghuang_vigogne-2-13b-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-2-13b-instruct (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-13b-instruct)
bofenghuang/vigogne-2-13b-instruct
ac1f326ea75a28197c4b8e7c015071e8eef64485
55.137029
null
14
13
false
true
true
true
2023-10-16T12:48:18Z
false
61.177474
83.250349
55.920358
51.078924
77.348066
2.047005
false
bofenghuang_vigogne-2-7b-chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-2-7b-chat (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-chat)
bofenghuang/vigogne-2-7b-chat
7a1b76feabe3e0ed007ea83ee93f7644156d3b23
52.449343
llama2
24
7
true
true
true
true
2023-09-09T10:52:17Z
false
55.631399
78.70942
50.981668
47.212656
74.427782
7.733131
false
bofenghuang_vigogne-2-7b-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-2-7b-instruct (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct)
bofenghuang/vigogne-2-7b-instruct
8f4dd9c870f748322989168af5c109e16b01c63d
52.021346
null
26
7
false
true
true
true
2023-10-16T12:48:18Z
false
56.228669
79.974109
47.171142
49.509581
75.453828
3.790751
false
bofenghuang_vigogne-33b-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-33b-instruct (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-33b-instruct)
bofenghuang/vigogne-33b-instruct
9c2b558b888e0ef8b4a72e0771db72a06a5c8474
58.077585
openrail
5
33
true
true
true
true
2023-09-09T10:52:17Z
false
63.054608
85.002987
58.31569
52.099744
78.847672
11.144807
false
bofenghuang_vigogne-7b-chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-7b-chat (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-7b-chat)
bofenghuang/vigogne-7b-chat
9af636df9c8693ea857b62442bd1c6c73d657dc6
49.267575
openrail
4
7
true
true
true
true
2023-09-09T10:52:17Z
false
52.474403
78.350926
39.510195
44.523466
73.164957
7.581501
false
bofenghuang_vigogne-7b-instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bofenghuang/vigogne-7b-instruct (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-7b-instruct)
bofenghuang/vigogne-7b-instruct
c6e2f515a0b289478118b5b75ff74107002ad962
47.758827
openrail
22
7
true
true
true
true
2023-10-16T12:46:18Z
false
51.962457
78.11193
38.434959
42.465027
72.84925
2.72934
false
bofenghuang_vigostral-7b-chat_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/bofenghuang/vigostral-7b-chat (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigostral-7b-chat)
bofenghuang/vigostral-7b-chat
969fbfc7a91f53c8562a2c48a3c24dd3745d5a97
59.183607
apache-2.0
27
7
true
true
true
true
2023-10-16T14:35:09Z
false
62.627986
84.33579
63.532272
49.239587
78.610892
16.755118
false
bongchoi_MoMo-70B-LoRA-V1.1_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
Unknown
https://huggingface.co/bongchoi/MoMo-70B-LoRA-V1.1 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__MoMo-70B-LoRA-V1.1)
bongchoi/MoMo-70B-LoRA-V1.1
ade069976a810b6b7caf3173a1aa4bfb30534ec9
67.534765
llama2
0
70
false
true
true
true
2023-12-01T01:20:56Z
false
66.638225
87.163912
66.755647
54.981347
83.346488
46.322972
false
bongchoi_test-llama2-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
https://huggingface.co/bongchoi/test-llama2-7b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama2-7b)
bongchoi/test-llama2-7b
ebe2e68699cb7ab6bb22688f265c89be2ac0fa6d
49.734711
0
7
false
true
true
true
2023-10-16T12:48:18Z
false
53.071672
78.570006
46.855989
38.750841
74.033149
7.126611
false
boomerchan_magpie-13b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
https://huggingface.co/boomerchan/magpie-13b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_boomerchan__magpie-13b)
boomerchan/magpie-13b
a58124cdc9f39ccd59d4290a8bdfda93ff3690dc
57.635232
0
12
false
true
true
true
2023-10-16T12:54:17Z
false
63.31058
84.246166
58.147127
49.146975
76.479874
14.480667
false
breadlicker45_dough-base-001_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/breadlicker45/dough-base-001 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_breadlicker45__dough-base-001)
breadlicker45/dough-base-001
e42b65191f97d786eadaba450f1d34baea470734
29.373809
null
0
0
false
true
true
true
2023-10-16T13:19:55Z
false
23.890785
24.756025
23.127426
53.403107
51.065509
0
false
breadlicker45_dough-instruct-base-001_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/breadlicker45/dough-instruct-base-001 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_breadlicker45__dough-instruct-base-001)
breadlicker45/dough-instruct-base-001
3e1b0bf0a887feeb342982eee4f6d8041772a7dd
29.373809
null
0
0
false
true
true
true
2023-10-16T12:46:18Z
false
23.890785
24.756025
23.127426
53.403107
51.065509
0
false
brucethemoose_CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties)
brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties
7be35464f07307b5503d12736f732a34f3c1d8c1
68.571047
other
4
34
true
false
true
true
2023-12-09T06:46:37Z
false
64.931741
84.993029
75.365013
52.838093
79.242305
54.056103
false
brucethemoose_CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity)
brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-ExtremeDensity
71c95f1971c4a47adc331859b91502bd0b790ce0
71.565015
other
0
34
true
false
true
true
2023-12-09T17:27:17Z
false
66.894198
85.690102
77.349863
57.633146
82.004736
59.818044
false
brucethemoose_CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity)
brucethemoose/CaPlatTessDolXaBoros-Yi-34B-200K-DARE-Ties-HighDensity
17fe477d833b16aab50bef843bc8bf196a2710ac
72.150483
other
11
34
true
false
true
true
2023-12-09T16:21:16Z
false
67.406143
85.769767
77.437999
57.844938
83.109708
61.334344
false
brucethemoose_CapyTessBorosYi-34B-200K-DARE-Ties_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__CapyTessBorosYi-34B-200K-DARE-Ties)
brucethemoose/CapyTessBorosYi-34B-200K-DARE-Ties
0475128a0e57fc103e65c601be75013f28987e62
71.307257
other
15
34
true
false
true
true
2023-12-04T08:52:40Z
false
64.931741
85.91914
76.181808
55.839211
83.030781
61.940864
false
brucethemoose_Capybara-Tess-Yi-34B-200K_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/Capybara-Tess-Yi-34B-200K (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__Capybara-Tess-Yi-34B-200K)
brucethemoose/Capybara-Tess-Yi-34B-200K
28a4464d357d9a4d91238d20ed30ecd2ee377be5
70.569496
other
15
34
true
false
true
true
2023-11-19T00:21:26Z
false
66.12628
86.237801
74.887769
56.373793
82.399369
57.391964
false
brucethemoose_SUS-Bagel-200K-DARE-Test_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/SUS-Bagel-200K-DARE-Test (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__SUS-Bagel-200K-DARE-Test)
brucethemoose/SUS-Bagel-200K-DARE-Test
063c5412143468d6408b6b8122ec925c0baa0add
74.0743
other
2
34
true
false
true
true
2024-01-11T19:17:46Z
false
68.088737
85.381398
76.977471
61.198934
83.504341
69.29492
false
brucethemoose_Yi-34B-200K-DARE-megamerge-v8_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-megamerge-v8 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-megamerge-v8)
brucethemoose/Yi-34B-200K-DARE-megamerge-v8
0823229057d02acb1c9dda173d6fb2ea3b46b0af
72.560457
other
25
34
true
false
true
true
2024-01-15T19:19:20Z
false
67.74744
86.058554
77.026032
56.308359
82.794002
65.428355
false
brucethemoose_Yi-34B-200K-DARE-merge-v5_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v5 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v5)
brucethemoose/Yi-34B-200K-DARE-merge-v5
72d2469926f0277d31b13ce2db78e454b24a91b0
71.975893
other
21
34
true
false
true
true
2023-12-17T00:57:25Z
false
66.467577
85.540729
77.221341
57.457736
82.241515
62.926459
false
brucethemoose_Yi-34B-200K-DARE-merge-v7_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v7 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_brucethemoose__Yi-34B-200K-DARE-merge-v7)
brucethemoose/Yi-34B-200K-DARE-merge-v7
9a6bfe30e2ab9eab807787bb0f3b7e91241d1ce0
73.123344
other
4
34
true
false
true
true
2024-01-11T19:14:02Z
false
68.088737
85.988847
77.298537
58.901697
83.109708
65.35254
false
bsp-albz_llama2-13b-platypus-ckpt-1000_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Unknown
https://huggingface.co/bsp-albz/llama2-13b-platypus-ckpt-1000 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bsp-albz__llama2-13b-platypus-ckpt-1000)
bsp-albz/llama2-13b-platypus-ckpt-1000
d9f3e490df2134784afc3a86f5c617a9bab8db4d
29.280794
0
12
false
true
true
true
2023-10-16T12:54:17Z
false
28.156997
26.548496
23.174954
48.790902
49.013418
0
false
budecosystem_boomer-1b_float16
float16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/budecosystem/boomer-1b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__boomer-1b)
budecosystem/boomer-1b
f8f24b5480fa43f23d858f0eb8d1af1b7ad0af59
28.435839
apache-2.0
4
1
true
true
true
true
2023-10-16T12:48:18Z
false
22.78157
31.577375
25.660364
39.172921
50.513023
0.90978
false
budecosystem_code-millenials-34b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/budecosystem/code-millenials-34b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__code-millenials-34b)
budecosystem/code-millenials-34b
fdb4dc33b18c884e51f9d8258f192b4ed0f93dc3
53.514269
llama2
6
33
true
true
true
true
2024-01-03T21:16:20Z
false
49.829352
75.094603
49.284893
45.367169
69.060773
32.448825
false
budecosystem_genz-13b-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/budecosystem/genz-13b-v2 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-13b-v2)
budecosystem/genz-13b-v2
98e0e2086df11b9f80e1571110540a657e52c2e8
54.202003
null
4
13
false
true
true
true
2023-10-16T12:48:18Z
false
55.972696
79.984067
54.299765
48.087823
74.585635
12.282032
false
budecosystem_genz-70b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/budecosystem/genz-70b (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-70b)
budecosystem/genz-70b
32110b4f33e5e80073ca1f47638482fdc0e19297
68.346868
null
30
70
false
true
true
true
2023-10-16T12:46:18Z
false
71.416382
87.99044
70.775707
62.656657
83.504341
33.73768
false
buildingthemoon_testfinetunedmodel_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
GPT2LMHeadModel
https://huggingface.co/buildingthemoon/testfinetunedmodel (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_buildingthemoon__testfinetunedmodel)
buildingthemoon/testfinetunedmodel
9efeae0561a9af68ea7f9b26c5184838760372bc
29.175605
0
0
false
true
true
true
2023-12-27T00:13:30Z
false
25.853242
31.398128
26.06783
40.747848
50.986582
0
false
bullerwins_Codestral-22B-v0.1-hf_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/bullerwins/Codestral-22B-v0.1-hf (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bullerwins__Codestral-22B-v0.1-hf)
bullerwins/Codestral-22B-v0.1-hf
4e5d3cb4878d592228f91e1f178a16059934da67
66.701703
other
15
22
true
true
true
true
2024-05-30T18:46:44Z
false
62.542662
81.756622
62.213753
56.700231
74.980268
62.016679
false
bunkalab_Phi-3-mini-128k-instruct-HumanChoice-4.6k-DPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
https://huggingface.co/bunkalab/Phi-3-mini-128k-instruct-HumanChoice-4.6k-DPO (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunkalab__Phi-3-mini-128k-instruct-HumanChoice-4.6k-DPO)
bunkalab/Phi-3-mini-128k-instruct-HumanChoice-4.6k-DPO
0a090da6835fa1b67f57ac690c0f4bb26bd0cb29
68.157659
0
3
false
true
true
true
2024-06-03T09:31:38Z
false
62.969283
80.093607
68.620842
54.505311
73.007103
69.74981
false
bunkalab_Phi-3-mini-128k-instruct-LinearBunkaScore-4.6k-DPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Phi3ForCausalLM
https://huggingface.co/bunkalab/Phi-3-mini-128k-instruct-LinearBunkaScore-4.6k-DPO (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunkalab__Phi-3-mini-128k-instruct-LinearBunkaScore-4.6k-DPO)
bunkalab/Phi-3-mini-128k-instruct-LinearBunkaScore-4.6k-DPO
9ef227fc199809649bfa18c1fdd50d80c8a41d38
68.07331
apache-2.0
2
3
true
true
true
true
2024-05-30T13:29:27Z
false
63.054608
79.934276
68.815533
54.418569
72.770324
69.44655
false
bunnycore_Blackbird-Llama-3-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/bunnycore/Blackbird-Llama-3-8B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Blackbird-Llama-3-8B)
bunnycore/Blackbird-Llama-3-8B
dc6083cddc63da019582e6febdb4ebc10c0a5425
61.715583
llama2
2
8
true
false
true
true
2024-05-20T16:08:54Z
false
60.750853
76.568413
63.295776
43.811896
77.26914
48.597422
false
bunnycore_Chimera-Apex-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/bunnycore/Chimera-Apex-7B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Chimera-Apex-7B)
bunnycore/Chimera-Apex-7B
9b34d7de582c71ad5ffd694774a16dc8da24dd85
69.244005
apache-2.0
3
7
true
false
true
true
2024-04-07T23:22:29Z
false
66.808874
86.486756
65.03151
51.008216
81.610103
64.518575
false
bunnycore_Cognitron-8B_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/bunnycore/Cognitron-8B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Cognitron-8B)
bunnycore/Cognitron-8B
ca74f73cc1cb12e20882c56a159b29d92022d5cf
69.501989
llama3
3
8
true
false
true
true
2024-05-04T00:49:17Z
false
63.31058
82.383987
68.074774
53.266857
77.26914
72.706596
false
bunnycore_CreativeSmart-2x7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MixtralForCausalLM
https://huggingface.co/bunnycore/CreativeSmart-2x7B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__CreativeSmart-2x7B)
bunnycore/CreativeSmart-2x7B
7b70587d6858599ce0b9e9e9513f022b0f5a86f2
68.759925
apache-2.0
1
12
true
false
false
true
2024-04-13T11:47:07Z
false
65.784983
85.112527
64.968886
47.326213
81.057616
68.309325
false
bunnycore_LuminariX-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/bunnycore/LuminariX-8B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__LuminariX-8B)
bunnycore/LuminariX-8B
2dec8e1dccfb5b475cb1b2f7ecafd1380855c935
67.504603
apache-2.0
1
8
true
false
true
true
2024-05-03T19:30:42Z
false
61.433447
81.049592
67.309529
49.811327
77.190213
68.23351
false
bunnycore_Maverick-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/bunnycore/Maverick-8B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Maverick-8B)
bunnycore/Maverick-8B
687bae2e7ab1bc5754694a919d887f7c5510a0a4
69.611537
llama3
1
8
true
false
true
true
2024-05-10T22:38:57Z
false
63.822526
82.334196
68.13736
53.39007
77.505919
72.479151
false
bunnycore_Mnemosyne-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/bunnycore/Mnemosyne-7B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Mnemosyne-7B)
bunnycore/Mnemosyne-7B
ff68ae667123d615f2638f6f55be62174712f19a
67.723125
apache-2.0
1
7
true
false
true
true
2024-04-08T18:06:14Z
false
65.443686
84.724159
62.48772
62.671676
78.926598
52.084913
false
bunnycore_SmartToxic-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/bunnycore/SmartToxic-7B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__SmartToxic-7B)
bunnycore/SmartToxic-7B
46d8a9c379a0d438e25e01122ff2dada8de98817
75.535426
apache-2.0
2
7
true
false
true
true
2024-04-12T14:57:48Z
false
72.866894
88.797052
65.114348
72.669209
84.92502
68.84003
false
bunnycore_Starling-dolphin-E26-7B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/bunnycore/Starling-dolphin-E26-7B (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_bunnycore__Starling-dolphin-E26-7B)
bunnycore/Starling-dolphin-E26-7B
903e4c4a616ef694a5ac2bde6415d13c7ada2870
71.558305
apache-2.0
1
7
true
false
true
true
2024-04-07T15:22:01Z
false
68.259386
86.008763
65.444275
57.774791
80.899763
70.962851
false
caisarl76_Mistral-7B-OpenOrca-Guanaco-accu16_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-OpenOrca-Guanaco-accu16)
caisarl76/Mistral-7B-OpenOrca-Guanaco-accu16
e83b8c1887c45473961a4ff36ae202ada1ca3d42
57.911048
llama2
0
7
true
true
true
true
2023-10-16T12:54:17Z
false
59.726962
83.08106
61.29238
50.810116
76.5588
15.996967
false
caisarl76_Mistral-7B-guanaco1k-ep2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
https://huggingface.co/caisarl76/Mistral-7B-guanaco1k-ep2 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__Mistral-7B-guanaco1k-ep2)
caisarl76/Mistral-7B-guanaco1k-ep2
9c9f31f213b69da7797c2c0630c17cf8f785fc13
58.12737
0
7
false
true
true
true
2023-10-16T12:48:18Z
false
60.068259
82.762398
61.497445
54.398942
78.058406
11.978772
false
caisarl76_mistral-guanaco1k-ep2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Unknown
https://huggingface.co/caisarl76/mistral-guanaco1k-ep2 (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_caisarl76__mistral-guanaco1k-ep2)
caisarl76/mistral-guanaco1k-ep2
9c9f31f213b69da7797c2c0630c17cf8f785fc13
58.12737
0
7
false
true
true
true
2023-10-16T12:46:18Z
false
60.068259
82.762398
61.497445
54.398942
78.058406
11.978772
false
camel-ai_CAMEL-13B-Combined-Data_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/camel-ai/CAMEL-13B-Combined-Data (📑 https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-13B-Combined-Data)
camel-ai/CAMEL-13B-Combined-Data
6d98f2801f13d89de7978ee9f348a52ea46a24ec
52.436074
11
13
false
true
true
true
2023-09-09T10:52:17Z
false
55.631399
79.247162
49.736196
47.42125
75.453828
7.126611
false
camel-ai_CAMEL-13B-Role-Playing-Data_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/camel-ai/CAMEL-13B-Role-Playing-Data" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">camel-ai/CAMEL-13B-Role-Playing-Data</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-13B-Role-Playing-Data" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
camel-ai/CAMEL-13B-Role-Playing-Data
762ecb0d85572c8f8bcbca06d27f7f64a4d74615
51.423424
20
13
false
true
true
true
2023-10-16T12:48:18Z
false
54.948805
79.247162
46.608418
46.348951
74.033149
7.354056
false
camel-ai_CAMEL-33B-Combined-Data_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/camel-ai/CAMEL-33B-Combined-Data" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">camel-ai/CAMEL-33B-Combined-Data</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_camel-ai__CAMEL-33B-Combined-Data" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
camel-ai/CAMEL-33B-Combined-Data
62c74e7531625c1383bbbdc7c8346a996e9d1e21
58.063683
5
33
false
true
true
true
2023-09-09T10:52:17Z
false
62.969283
83.827923
58.978906
50.209208
78.295185
14.101592
false
capleaf_T-Llama_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/capleaf/T-Llama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">capleaf/T-Llama</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_capleaf__T-Llama" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
capleaf/T-Llama
606737da032181825934ad0f6b6646336fd06dcc
54.337294
llama2
6
6
true
true
true
true
2024-04-14T20:36:56Z
false
54.180887
76.478789
47.978855
46.470847
71.270718
29.643669
false
carsenk_flippa-exp26-v3-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/carsenk/flippa-exp26-v3-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">carsenk/flippa-exp26-v3-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_carsenk__flippa-exp26-v3-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
carsenk/flippa-exp26-v3-7b
2dda3515c5bbf02824addbe2e8f924a48ce21156
73.251773
apache-2.0
0
7
true
true
true
true
2024-03-04T20:25:02Z
false
68.088737
86.496714
64.418988
67.353889
84.767167
68.38514
false
castorini_rank_vicuna_7b_v1_fp16_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/castorini/rank_vicuna_7b_v1_fp16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">castorini/rank_vicuna_7b_v1_fp16</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_castorini__rank_vicuna_7b_v1_fp16" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
castorini/rank_vicuna_7b_v1_fp16
0f3556bb0227cb59bcc652584d879f3bc40102e6
44.36329
llama2
3
7
true
true
true
true
2024-01-03T15:23:11Z
false
44.624573
65.674168
44.1397
45.127252
66.614049
0
false
ceadar-ie_FinanceConnect-13B_8bit
8bit
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/ceadar-ie/FinanceConnect-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ceadar-ie/FinanceConnect-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ceadar-ie__FinanceConnect-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ceadar-ie/FinanceConnect-13B
9ed6c7154cd14d1a5cdbec603a3ae8c8ce05cb29
49.338775
apache-2.0
13
13
true
true
true
true
2023-12-04T21:42:55Z
false
55.119454
77.733519
52.082054
37.682302
71.823204
1.592115
false
cerebras_Cerebras-GPT-1.3B_float16
float16
🟢 pretrained
🟢
Original
?
<a target="_blank" href="https://huggingface.co/cerebras/Cerebras-GPT-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cerebras/Cerebras-GPT-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cerebras/Cerebras-GPT-1.3B
5b95400ee8d1e3cc9f79f0dec7182ed9c1009c34
31.295067
apache-2.0
46
1
true
true
true
true
2023-09-09T10:52:17Z
false
26.279863
38.53814
26.592933
42.698714
53.433307
0.227445
false
cerebras_Cerebras-GPT-111M_float16
float16
🟢 pretrained
🟢
Original
?
<a target="_blank" href="https://huggingface.co/cerebras/Cerebras-GPT-111M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cerebras/Cerebras-GPT-111M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-111M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cerebras/Cerebras-GPT-111M
d2b54d7af419055f204690fe0385959616a1723e
27.754219
apache-2.0
72
0
true
true
true
true
2023-09-09T10:52:17Z
false
20.221843
26.727743
25.51215
46.312983
47.750592
0
false
cerebras_Cerebras-GPT-13B_float16
float16
🟢 pretrained
🟢
Original
GPT2Model
<a target="_blank" href="https://huggingface.co/cerebras/Cerebras-GPT-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cerebras/Cerebras-GPT-13B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-13B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cerebras/Cerebras-GPT-13B
7e97fa4b15edd955094c4395d62e6f4290e365b5
37.395335
apache-2.0
638
13
true
true
true
true
2023-09-09T10:52:17Z
false
38.139932
60.007967
25.923431
39.185465
59.826361
1.288855
false
cerebras_Cerebras-GPT-2.7B_float16
float16
🟢 pretrained
🟢
Original
?
<a target="_blank" href="https://huggingface.co/cerebras/Cerebras-GPT-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cerebras/Cerebras-GPT-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cerebras/Cerebras-GPT-2.7B
4383dfd80aafdbcfd0876419d246de51e6cbf7c1
33.253603
apache-2.0
41
2
true
true
true
true
2023-10-16T12:48:18Z
false
29.095563
49.29297
25.166916
41.367634
54.143646
0.45489
false
cerebras_Cerebras-GPT-256M_float16
float16
🟢 pretrained
🟢
Original
?
<a target="_blank" href="https://huggingface.co/cerebras/Cerebras-GPT-256M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cerebras/Cerebras-GPT-256M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-256M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cerebras/Cerebras-GPT-256M
d77812ac95aece1f1edef6745ae2a1b325ad01a4
29.382849
apache-2.0
24
0
true
true
true
true
2023-09-09T10:52:17Z
false
22.013652
28.988249
26.830722
45.978283
52.486188
0
false
cerebras_Cerebras-GPT-590M_float16
float16
?
?
Original
?
<a target="_blank" href="https://huggingface.co/cerebras/Cerebras-GPT-590M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cerebras/Cerebras-GPT-590M</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-590M" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cerebras/Cerebras-GPT-590M
67a653304fd782a34906d59f3795a37f9e053397
29.14066
apache-2.0
20
0
true
true
true
true
2023-09-09T10:52:17Z
false
23.720137
32.403904
25.971349
44.148454
48.145225
0.45489
false
cerebras_Cerebras-GPT-6.7B_float16
float16
🟢 pretrained
🟢
Original
?
<a target="_blank" href="https://huggingface.co/cerebras/Cerebras-GPT-6.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cerebras/Cerebras-GPT-6.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cerebras__Cerebras-GPT-6.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cerebras/Cerebras-GPT-6.7B
4f56c6e28f9a2a1c470626f1a064238806f19f09
36.27223
apache-2.0
65
6
true
true
true
true
2023-10-16T12:48:18Z
false
35.068259
59.360685
25.928395
38.023946
58.721389
0.530705
false
cgato_TheSpice-7b-FT-ExperimentalOrca_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cgato/TheSpice-7b-FT-ExperimentalOrca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cgato/TheSpice-7b-FT-ExperimentalOrca</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__TheSpice-7b-FT-ExperimentalOrca" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cgato/TheSpice-7b-FT-ExperimentalOrca
89feebddbb3b836f898d5f40287f3d4e8cb27b39
63.861459
cc-by-nc-4.0
0
7
true
true
true
true
2024-03-27T13:50:42Z
false
62.627986
84.256124
63.334793
54.865364
79.873717
38.210766
false
cgato_Thespis-7b-v0.2-SFTTest-3Epoch_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cgato/Thespis-7b-v0.2-SFTTest-3Epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cgato/Thespis-7b-v0.2-SFTTest-3Epoch</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__Thespis-7b-v0.2-SFTTest-3Epoch" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cgato/Thespis-7b-v0.2-SFTTest-3Epoch
e9c6150deb741e6d27cbd947bf6b6c9c472f0750
62.936024
apache-2.0
1
7
true
true
true
true
2024-02-09T02:28:55Z
false
63.225256
84.385581
62.588553
53.898705
77.505919
36.01213
false
cgato_Thespis-CurtainCall-7b-v0.3_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cgato/Thespis-CurtainCall-7b-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cgato/Thespis-CurtainCall-7b-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__Thespis-CurtainCall-7b-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cgato/Thespis-CurtainCall-7b-v0.3
cc6a7116ab0b3651bbd03a15eb90f8fb5330e340
62.727766
cc-by-nc-4.0
2
7
true
true
false
true
2024-03-14T18:05:45Z
false
64.249147
82.931687
62.244701
50.953369
78.610892
37.376801
false
cgato_Thespis-Krangled-7b-v2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/cgato/Thespis-Krangled-7b-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">cgato/Thespis-Krangled-7b-v2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_cgato__Thespis-Krangled-7b-v2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
cgato/Thespis-Krangled-7b-v2
dc8cbcfe36ae94b19cd7e4c4c5afdf55b825865f
63.435013
cc-by-nc-4.0
0
7
true
true
true
true
2024-03-15T01:26:13Z
false
62.883959
83.041227
62.44281
53.022348
77.900552
41.319181
false
chansung_gpt4-alpaca-lora-13b-decapoda-1024_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
Unknown
<a target="_blank" href="https://huggingface.co/chansung/gpt4-alpaca-lora-13b-decapoda-1024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chansung/gpt4-alpaca-lora-13b-decapoda-1024</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_chansung__gpt4-alpaca-lora-13b-decapoda-1024" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chansung/gpt4-alpaca-lora-13b-decapoda-1024
7aedafea409de07a997d70a84e30242c7b86877c
54.508624
apache-2.0
4
13
false
true
true
true
2023-10-16T12:48:18Z
false
59.385666
81.866162
47.751508
52.588135
77.348066
8.112206
false
chansung_llamaduo_synth_ds_v0.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Adapter
?
<a target="_blank" href="https://huggingface.co/chansung/llamaduo_synth_ds_v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chansung/llamaduo_synth_ds_v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_chansung__llamaduo_synth_ds_v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chansung/llamaduo_synth_ds_v0.1
93337dd9b4b6bef9cbcbb2e26aad8880e6670f3e
55.530443
gemma
1
0
true
true
true
true
2024-04-30T15:05:02Z
false
55.290102
76.667994
58.112174
43.503248
67.008682
32.600455
false
chanwit_flux-7b-v0.1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/chanwit/flux-7b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chanwit/flux-7b-v0.1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-7b-v0.1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chanwit/flux-7b-v0.1
566b7dcfb2d7233d49611bda27ff5430487d1aad
70.850503
apache-2.0
0
7
true
true
true
true
2024-01-13T07:46:38Z
false
67.064846
86.178052
65.399154
55.052105
79.005525
72.403336
false
chanwit_flux-7b-v0.2_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/chanwit/flux-7b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chanwit/flux-7b-v0.2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-7b-v0.2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chanwit/flux-7b-v0.2
6ff053b441ac4efec7b92828c64a8a6f1649a6f6
70.300374
apache-2.0
0
7
true
true
true
true
2024-01-17T18:03:49Z
false
66.552901
86.118303
65.375009
51.80402
79.321231
72.630781
false
chanwit_flux-base-optimized_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/chanwit/flux-base-optimized" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chanwit/flux-base-optimized</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-base-optimized" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chanwit/flux-base-optimized
ce9f1f49559007d5b81249fd1ca3eb8be088fe43
63.118039
apache-2.0
0
7
true
true
true
true
2024-02-11T22:23:18Z
false
65.52901
81.756622
59.841628
50.032753
77.348066
44.200152
false
chanwit_flux-base-optimized_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/chanwit/flux-base-optimized" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chanwit/flux-base-optimized</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_chanwit__flux-base-optimized" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chanwit/flux-base-optimized
ce9f1f49559007d5b81249fd1ca3eb8be088fe43
63.222449
apache-2.0
0
7
true
true
true
true
2024-02-11T22:23:26Z
false
65.443686
81.736706
59.738656
50.017903
77.742699
44.655042
false
chargoddard_Chronorctypus-Limarobormes-13b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/chargoddard/Chronorctypus-Limarobormes-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">chargoddard/Chronorctypus-Limarobormes-13b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__Chronorctypus-Limarobormes-13b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
chargoddard/Chronorctypus-Limarobormes-13b
75c1bf5f4b40cf61873ff6487ccd3efc4f684330
55.2154
null
12
13
false
true
true
true
2023-09-09T10:52:17Z
false
59.897611
82.75244
58.448137
51.899862
74.427782
3.866566
false