| Column | Dtype | Range / distinct values |
| --- | --- | --- |
| eval_name | string | length 9–97 |
| Precision | string | 5 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 53 classes |
| Model | string | length 355–611 (HTML anchor linking the model page, https://huggingface.co/{fullname}, and its open-llm-leaderboard details dataset 📑) |
| fullname | string | length 4–89 |
| Model sha | string | length 0–40 |
| Average ⬆️ | float64 | 27–81.3 |
| Hub License | string | 35 classes |
| Hub ❤️ | int64 | 0–4.88k |
| #Params (B) | int64 | 0–238 |
| Available on the hub | bool | 2 classes |
| Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| date | string | length 0–26 |
| Chat Template | bool | 2 classes |
| ARC | float64 | 19.7–87.5 |
| HellaSwag | float64 | 20.7–92.8 |
| MMLU | float64 | 17.8–89.4 |
| TruthfulQA | float64 | 27.9–82.3 |
| Winogrande | float64 | 47.2–91.5 |
| GSM8K | float64 | 0–88.2 |
| Maintainers Choice | bool | 2 classes |
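
The columns above describe one flat record per evaluation, so the rows below can be sliced with a few lines of pandas. This is a minimal sketch, not part of the source: it assumes the rows have been exported to a hypothetical `open_llm_leaderboard_rows.csv` with exactly these column headers, and that the boolean columns arrive as the literal strings `true`/`false` shown in the data.

```python
import pandas as pd

# Hypothetical CSV export of the rows below, keeping the column names listed above.
df = pd.read_csv("open_llm_leaderboard_rows.csv")

# The boolean columns are stored as the strings "true"/"false"; normalise them to real bools.
bool_cols = ["Available on the hub", "Merged", "MoE", "Flagged", "Chat Template", "Maintainers Choice"]
for col in bool_cols:
    df[col] = df[col].astype(str).str.lower().eq("true")

# Example query: MoE models of at most 50B parameters, best average score first.
moe_small = (
    df[df["MoE"] & (df["#Params (B)"] <= 50)]
    .sort_values("Average ⬆️", ascending=False)
    .loc[:, ["fullname", "Precision", "#Params (B)", "Average ⬆️", "GSM8K"]]
)
print(moe_small.head(10).to_string(index=False))
```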

| eval_name | Precision | Type | T | Weight type | Architecture | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | Merged | MoE | Flagged | date | Chat Template | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K | Maintainers Choice |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| hfl_chinese-llama-2-13b-16k_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | hfl/chinese-llama-2-13b-16k | 1c90d6543e629e54bf17dd8bbc3698e59eec5972 | 50.542815 | apache-2.0 | 12 | 13 | true | true | true | true | 2024-05-29T03:04:50Z | false | 55.290102 | 78.191595 | 50.923595 | 37.11446 | 75.217048 | 6.520091 | false |
| hfl_chinese-llama-2-7b_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | hfl/chinese-llama-2-7b | c40cf9ac38b789d542b582f842a9f62511fa3bf1 | 43.545228 | apache-2.0 | 96 | 7 | true | true | true | true | 2024-05-29T03:07:48Z | false | 44.453925 | 69.537941 | 37.416012 | 37.011965 | 69.060773 | 3.790751 | false |
| hfl_chinese-llama-2-7b-16k_float16 | float16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | hfl/chinese-llama-2-7b-16k | c934a7942dd2a20c4be1bc4d69c142c8d5c18b86 | 42.407073 | apache-2.0 | 10 | 7 | true | true | true | true | 2024-05-29T03:04:26Z | false | 43.600683 | 68.024298 | 35.268082 | 37.410499 | 67.561168 | 2.57771 | false |
| hfl_chinese-mixtral_float16 | float16 | 🟢 pretrained | 🟢 | Original | MixtralForCausalLM | hfl/chinese-mixtral | 7b37775efb34a0734efd60a32781bd706c60e85b | 58.569134 | apache-2.0 | 3 | 46 | true | true | false | true | 2024-02-02T01:26:53Z | false | 67.491468 | 85.251942 | 70.309109 | 46.752185 | 81.610103 | 0 | false |
| hfl_chinese-mixtral_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | hfl/chinese-mixtral | 7b37775efb34a0734efd60a32781bd706c60e85b | 58.693211 | apache-2.0 | 3 | 46 | true | true | false | true | 2024-02-04T08:39:26Z | false | 67.576792 | 85.341565 | 70.377631 | 46.85854 | 82.004736 | 0 | false |
| hfl_chinese-mixtral-instruct_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | hfl/chinese-mixtral-instruct | 219c9d65843f4c7356e5efffe399a7208e0dea25 | 70.193646 | apache-2.0 | 19 | 46 | true | true | false | true | 2024-02-02T01:27:13Z | false | 67.74744 | 85.670185 | 71.525475 | 57.46085 | 83.109708 | 55.648218 | false |
| hfl_llama-3-chinese-8b_bfloat16 | bfloat16 | 🟩 continuously pretrained | 🟩 | Original | LlamaForCausalLM | hfl/llama-3-chinese-8b | 45418dec4493e8a6826bf68e3b5996320135f57a | 59.214266 | apache-2.0 | 7 | 8 | true | true | true | true | 2024-04-29T02:17:08Z | false | 55.887372 | 79.535949 | 63.705325 | 41.141271 | 77.03236 | 37.983321 | false |
| hfl_llama-3-chinese-8b-instruct_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | hfl/llama-3-chinese-8b-instruct | c05093c656afd693be354ae86fef1a2b4d0dd8b7 | 63.208452 | apache-2.0 | 7 | 8 | true | true | true | true | 2024-04-30T08:51:09Z | false | 61.262799 | 80.242979 | 63.103975 | 55.154168 | 75.059195 | 44.427597 | false |
| hfl_llama-3-chinese-8b-instruct-v2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | hfl/llama-3-chinese-8b-instruct-v2 | 15cfcd776b55047b601bf6635052f059ca754ded | 66.675675 | apache-2.0 | 32 | 8 | true | true | true | true | 2024-05-08T00:52:21Z | false | 62.627986 | 79.715196 | 66.483042 | 53.934977 | 76.716654 | 60.576194 | false |
| hfl_llama-3-chinese-8b-instruct-v3_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | hfl/llama-3-chinese-8b-instruct-v3 | 17e2da25784bd84014a5caf01b30793e5c08a4ec | 66.80613 | apache-2.0 | 23 | 8 | true | true | true | true | 2024-05-28T22:55:06Z | false | 63.395904 | 80.51185 | 67.904566 | 53.569839 | 76.243094 | 59.211524 | false |
| hiyouga_Baichuan2-7B-Base-LLaMAfied_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hiyouga/Baichuan2-7B-Base-LLaMAfied | dc5bda435771212fc73a8c6556fbdf4fcd87f96d | 48.990721 | other | 7 | 7 | true | true | true | true | 2023-10-16T12:54:17Z | false | 49.573379 | 73.451504 | 54.856886 | 37.53538 | 70.718232 | 7.808946 | false |
| hiyouga_Baichuan2-7B-Chat-LLaMAfied_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hiyouga/Baichuan2-7B-Chat-LLaMAfied | da2cd76e2d61bf0247bd67a4f2835319c54a7d62 | 51.415763 | other | 4 | 7 | true | true | true | true | 2023-10-16T12:54:17Z | false | 52.474403 | 74.039036 | 53.883561 | 48.040517 | 69.1397 | 10.917362 | false |
| hiyouga_Qwen-14B-Chat-LLaMAfied_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hiyouga/Qwen-14B-Chat-LLaMAfied | 29e92e74dca4a79aa8c2c451287ff97c4dccb323 | 61.600266 | other | 8 | 14 | true | true | true | true | 2023-12-26T09:58:35Z | false | 57.508532 | 82.105158 | 65.574902 | 51.985208 | 72.928177 | 39.499621 | false |
| hjhj3168_Llama-3-8b-ortho-v2_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | hjhj3168/Llama-3-8b-ortho-v2 | 3f51ea5d2c1ef397034edd5ef4bbecd2c706b3f6 | 64.92613 | | 0 | 8 | false | true | true | true | 2024-05-05T22:25:44Z | false | 59.044369 | 78.350926 | 64.393686 | 49.371955 | 75.848461 | 62.547384 | false |
| hon9kon9ize_CantoneseLLM-6B-preview202402_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | hon9kon9ize/CantoneseLLM-6B-preview202402 | 71474831ebfa33d02692e22f2ed7267d534f9e06 | 56.929218 | other | 7 | 6 | true | true | true | true | 2024-02-09T09:31:18Z | false | 55.631399 | 75.801633 | 63.067234 | 42.257887 | 74.112076 | 30.70508 | false |
| hon9kon9ize_CantoneseLLMChat-preview20240326_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hon9kon9ize/CantoneseLLMChat-preview20240326 | bb904937794feaacc7bd193a3955369a2947fe81 | 53.1028 | cc-by-nc-sa-4.0 | 10 | 6 | true | true | true | true | 2024-04-06T19:53:53Z | false | 52.559727 | 69.04999 | 59.192777 | 41.865233 | 70.323599 | 25.625474 | false |
| hon9kon9ize_cantonesellm-cpt-202405_bfloat16 | bfloat16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | hon9kon9ize/cantonesellm-cpt-202405 | db0b2400420b104f2fc2441619928e8cbf424e29 | 60.198393 | apache-2.0 | 1 | 6 | true | true | true | true | 2024-05-19T14:46:03Z | false | 55.204778 | 77.046405 | 63.827422 | 43.581069 | 74.980268 | 46.550417 | false |
| hongzoh_Yi-6B_Open-Orca_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hongzoh/Yi-6B_Open-Orca | 1b4918ab9c4fe63dfc38871ecaf59bea7c38a2d9 | 50.178726 | apache-2.0 | 0 | 6 | true | true | true | true | 2024-03-30T07:58:57Z | false | 51.194539 | 69.59769 | 58.055917 | 38.629873 | 70.402526 | 13.191812 | false |
| hongzoh_Yi-6B_Open-Platypus-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hongzoh/Yi-6B_Open-Platypus-v2 | 7844a6dbde22616af0f0221d7f26af03ae6e39f1 | 51.641353 | apache-2.0 | 0 | 6 | true | true | true | true | 2024-03-29T11:15:12Z | false | 49.914676 | 72.176857 | 57.591918 | 42.338274 | 71.981058 | 15.845337 | false |
| hongzoh_Yi-Ko-6B_Open-Platypus_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hongzoh/Yi-Ko-6B_Open-Platypus | 05c6d47cb712918dcaaa845db9749c3ba9a971dd | 50.2206 | apache-2.0 | 0 | 6 | true | true | true | true | 2024-04-09T05:47:34Z | false | 48.634812 | 74.158534 | 54.204973 | 39.133744 | 72.454617 | 12.736922 | false |
| hooking-dev_Jennifer-v1.0_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | hooking-dev/Jennifer-v1.0 | c775edefa2f3b65ff4618cb1685c80d6135432a0 | 60.022212 | apache-2.0 | 2 | 0 | true | true | true | true | 2024-05-26T02:37:53Z | false | 59.556314 | 82.97152 | 62.344603 | 41.443773 | 79.321231 | 34.49583 | false |
| hooking-dev_Monah-8b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hooking-dev/Monah-8b | b6d3a6af74560be2873197d1ffe68605db5fd4d9 | 61.118468 | | 0 | 8 | false | true | true | true | 2024-04-29T15:39:04Z | false | 58.87372 | 80.701056 | 64.694258 | 43.196008 | 76.637727 | 42.608036 | false |
| hooking-dev_Monah-8b-Uncensored-v0.2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hooking-dev/Monah-8b-Uncensored-v0.2 | e56c3addcb821fcf8d59dd9019331a764debd0db | 62.215005 | apache-2.0 | 2 | 8 | true | true | true | true | 2024-05-17T20:49:46Z | false | 58.617747 | 81.49771 | 65.366139 | 47.037038 | 76.874507 | 43.896892 | false |
| hoskinson-center_proofGPT-v0.1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | hoskinson-center/proofGPT-v0.1 | 1e4dd330ca90c0ef6d77ca71bd49cbe3d71f26b8 | 29.939019 | mit | 3 | 0 | true | true | true | true | 2023-10-16T12:48:18Z | false | 22.866894 | 28.65963 | 25.959309 | 51.638368 | 50.434096 | 0.075815 | false |
| hoskinson-center_proofGPT-v0.1-6.7B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | hoskinson-center/proofGPT-v0.1-6.7B | 02f405f08ca0e5b1aaa90a7c3b11303b5f245102 | 29.722545 | mit | 9 | 6 | true | true | true | true | 2023-10-16T13:00:29Z | false | 23.293515 | 28.450508 | 24.571964 | 50.874846 | 51.144436 | 0 | false |
| hpcai-tech_Colossal-LLaMA-2-7b-base_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hpcai-tech/Colossal-LLaMA-2-7b-base | 1f30e4f2037e1e30122667639b8ef37138e85057 | 51.385097 | llama2 | 74 | 7 | true | true | true | true | 2023-10-16T12:46:18Z | false | 53.498294 | 70.503884 | 54.404948 | 50.191242 | 70.007893 | 9.704321 | false |
| huangyt_Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | huangyt/Mistral-7B-v0.1-Open-Platypus_2.5w-r16-gate_up_down | 77f7bf749a6c4561b5364b291152b54ba19a59fb | 61.705222 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-01-14T16:07:03Z | false | 61.262799 | 83.190599 | 63.874002 | 45.435318 | 77.348066 | 39.120546 | false |
| huashiyiqike_testmodel_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPT2LMHeadModel | huashiyiqike/testmodel | 1ac5d244402e2433b6abfcff1fe65e84af15766b | 27.597541 | cc-by-nc-sa-4.0 | 1 | 0 | true | true | true | true | 2023-09-09T10:52:17Z | false | 19.709898 | 26.677953 | 25.275541 | 43.724535 | 50.197316 | 0 | false |
| huggingface_llama-13b_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | huggingface/llama-13b | 4022c52fcc7473ce7364bb5ac166195903ea1efb | 51.355149 | other | 0 | 13 | false | true | true | true | 2023-10-16T12:48:18Z | false | 56.228669 | 80.930094 | 47.671599 | 39.47594 | 76.243094 | 7.581501 | false |
| huggingface_llama-30b_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | huggingface/llama-30b | 13c77caa472bfa79d4f3f0ec82cbdc9dd88e5d22 | 56.937955 | other | 0 | 32 | false | true | true | true | 2023-09-09T10:52:17Z | false | 61.262799 | 84.734117 | 58.469577 | 42.269922 | 80.031571 | 14.859742 | false |
| huggingface_llama-65b_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | huggingface/llama-65b | 4ae2e56610e8b9b9a78472708390668e9096b4f9 | 61.193127 | other | 0 | 65 | false | true | true | true | 2023-09-09T10:52:17Z | false | 63.481229 | 86.088429 | 63.930642 | 43.428761 | 82.557222 | 27.672479 | false |
| huggingface_llama-7b_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | huggingface/llama-7b | f356572651e58fb337d610470d4b36976e7fb802 | 45.646579 | other | 0 | 6 | false | true | true | true | 2023-09-09T10:52:17Z | false | 51.023891 | 77.823143 | 35.711061 | 34.329505 | 71.428571 | 3.563306 | false |
| huggingtweets_bladeecity-jerma985_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPT2LMHeadModel | huggingtweets/bladeecity-jerma985 | 9bf3a0db7f6bc960c51f2c0dc6fb66ed982b0180 | 29.492652 | null | 0 | 0 | false | true | true | true | 2023-09-09T10:52:17Z | false | 22.866894 | 30.531767 | 26.55871 | 44.985912 | 52.012628 | 0 | false |
| huggingtweets_gladosystem_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPT2LMHeadModel | huggingtweets/gladosystem | 02a1bbcee7b584ace743b2fe4885cc0eaf2179ac | 28.291242 | null | 2 | 0 | false | true | true | true | 2023-09-09T10:52:17Z | false | 24.40273 | 29.705238 | 23.184754 | 41.783853 | 50.670876 | 0 | false |
| huggingtweets_jerma985_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPT2LMHeadModel | huggingtweets/jerma985 | 816206ad02a397161be78dcb70eeda67e0c53132 | 28.971653 | null | 0 | 0 | false | true | true | true | 2023-10-16T12:46:18Z | false | 21.672355 | 30.910177 | 26.569343 | 44.007166 | 50.670876 | 0 | false |
| huggyllama_llama-13b_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | huggyllama/llama-13b | bf57045473f207bb1de1ed035ace226f4d9f9bba | 51.329591 | other | 137 | 13 | true | true | true | true | 2023-09-09T10:52:17Z | false | 56.143345 | 80.920135 | 47.610588 | 39.478883 | 76.243094 | 7.581501 | false |
| huggyllama_llama-30b_float16 | float16 | ? | | Original | LlamaForCausalLM | huggyllama/llama-30b | 2b1edcdb3c7ced7bce6c1aa75c94545777c3118b | 56.963228 | other | 43 | 32 | true | true | true | true | 2023-10-16T12:46:18Z | false | 61.433447 | 84.734117 | 58.446613 | 42.273876 | 80.031571 | 14.859742 | false |
| huggyllama_llama-65b_float16 | float16 | 🟢 pretrained | 🟢 | Original | LlamaForCausalLM | huggyllama/llama-65b | 49707c5313d34d1c5a846e29cf2a2a650c22c8ee | 62.785242 | other | 72 | 65 | true | true | true | true | 2023-11-06T10:31:15Z | false | 63.481229 | 86.088429 | 63.930642 | 43.428761 | 82.557222 | 37.225171 | false |
| huggyllama_llama-7b_float16 | float16 | ? | | Original | LlamaForCausalLM | huggyllama/llama-7b | 8416d3fefb0cb3ff5775a7b13c1692d10ff1aa16 | 46.372992 | other | 265 | 6 | true | true | true | true | 2023-12-01T06:07:07Z | false | 50.938567 | 77.813185 | 35.693306 | 34.327933 | 71.428571 | 8.036391 | false |
| huseyinatahaninan_phi-2-dpo_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | huseyinatahaninan/phi-2-dpo | e23c721e850052435d5b0c1c664432a11bbbd26e | 62.328865 | mit | 0 | 0 | true | true | true | true | 2024-02-12T21:05:29Z | false | 63.054608 | 76.359291 | 58.462359 | 45.354154 | 74.033149 | 56.709629 | false |
| huseyinatahaninan_phi-2-instruction_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | huseyinatahaninan/phi-2-instruction | 120e8a957f9889b744ae4d5fcf871f57f6bb4264 | 60.855336 | mit | 0 | 0 | true | true | true | true | 2024-02-01T17:29:33Z | false | 61.09215 | 74.676359 | 57.81233 | 45.098738 | 74.822415 | 51.630023 | false |
| huseyinatahaninan_phi-2-instruction_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Adapter | ? | huseyinatahaninan/phi-2-instruction | 120e8a957f9889b744ae4d5fcf871f57f6bb4264 | 60.921102 | mit | 0 | 0 | true | true | true | true | 2024-02-02T14:26:55Z | false | 61.348123 | 74.72615 | 57.765008 | 44.956524 | 74.191002 | 52.539803 | false |
| hydra-project_ChatHercules-2.5-Mistral-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | hydra-project/ChatHercules-2.5-Mistral-7B | a50dd22ab08cb628642dcbd62edc25230c649bc4 | 68.235462 | apache-2.0 | 9 | 7 | true | false | true | true | 2024-03-04T06:02:37Z | false | 65.102389 | 84.614619 | 65.351666 | 47.523751 | 81.846882 | 64.973465 | false |
| hydra-project_OpenHyperion-2.5-Mistral-7B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | hydra-project/OpenHyperion-2.5-Mistral-7B | 85a94bc7584beb08e8df09bad85f06b786f184c4 | 66.322468 | apache-2.0 | 2 | 7 | true | false | true | true | 2024-03-10T19:25:17Z | false | 64.249147 | 84.863573 | 63.856012 | 49.92081 | 79.321231 | 55.724033 | false |
| hyunjae_polyglot-ko-3.8b-total_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | GPTNeoXForCausalLM | hyunjae/polyglot-ko-3.8b-total | 658a043415467ca5286f3348493db10aa8b94f2c | 31.870132 | mit | 0 | 3 | false | true | true | true | 2024-02-04T03:11:27Z | false | 25.341297 | 39.693288 | 29.160627 | 43.6712 | 53.35438 | 0 | false |
| hyunseoki_ko-en-llama2-13b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hyunseoki/ko-en-llama2-13b | 2768cf6f955b65868ccbb20658e2cc444b2f3be9 | 51.272833 | null | 26 | 13 | false | true | true | true | 2023-10-16T12:54:17Z | false | 58.191126 | 81.886078 | 52.018125 | 39.961101 | 74.822415 | 0.75815 | false |
| hyunseoki_ko-ref-llama2-13b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hyunseoki/ko-ref-llama2-13b | c5d09631c88ab5012b48187ecd90ae773cd4bbd9 | 43.622365 | null | 1 | 13 | false | true | true | true | 2023-10-16T12:46:18Z | false | 48.37884 | 73.561044 | 34.834486 | 35.820119 | 69.1397 | 0 | false |
| hyunseoki_ko-ref-llama2-7b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | hyunseoki/ko-ref-llama2-7b | 1ee08c79ae7393473754b77e82b1472ef63d5dd2 | 40.747265 | null | 3 | 7 | false | true | true | true | 2023-10-16T12:48:18Z | false | 42.662116 | 66.580362 | 30.405543 | 38.616153 | 66.219416 | 0 | false |
| hywu_Camelidae-8x13B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | hywu/Camelidae-8x13B | 857292e46549732062a27eb965f3c9869dc62794 | 59.402691 | apache-2.0 | 4 | 13 | true | true | false | true | 2024-01-10T14:32:15Z | false | 61.177474 | 82.732523 | 57.214416 | 43.372023 | 77.348066 | 34.571645 | false |
| hywu_Camelidae-8x7B_bfloat16 | bfloat16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | hywu/Camelidae-8x7B | c12485aa7b31943113d992076cc2d79dce2a73a4 | 54.472251 | apache-2.0 | 14 | 7 | true | true | false | true | 2024-01-10T14:32:00Z | false | 55.631399 | 79.177455 | 50.098543 | 42.862697 | 76.243094 | 22.820318 | false |
| iGenius-AI-Team_LLAMA-13B-test-finetuning_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Unknown | iGenius-AI-Team/LLAMA-13B-test-finetuning | 5bd0eb026b12c59fd198f307c0c17188af69744c | 56.338291 | | 0 | 13 | false | true | true | true | 2023-11-19T18:06:21Z | false | 58.020478 | 82.364071 | 54.273868 | 44.137618 | 76.716654 | 22.517058 | false |
| iRyanBell_ARC1_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | LlamaForCausalLM | iRyanBell/ARC1 | 4ea7dc0c6dc9beeb634df5bd71dd5e88cf157e93 | 66.688243 | llama3 | 1 | 8 | true | true | true | true | 2024-05-30T02:57:01Z | false | 58.788396 | 76.409082 | 65.734664 | 52.733966 | 76.637727 | 69.825625 | false |
| ibivibiv_aegolius-acadicus-30b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/aegolius-acadicus-30b | 1260e0b4085ce8f6fbbe41192c5932d084706be4 | 74.70003 | | 0 | 29 | false | true | true | true | 2024-01-25T06:12:09Z | false | 72.610922 | 88.010357 | 65.074155 | 67.071766 | 84.92502 | 70.507961 | false |
| ibivibiv_aegolius-acadicus-34b-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/aegolius-acadicus-34b-v3 | c43b47a1d94a5daf790c506d113e5ee258871822 | 68.594442 | apache-2.0 | 0 | 35 | true | false | false | true | 2024-01-29T19:35:48Z | false | 67.662116 | 85.540729 | 62.133005 | 63.33325 | 78.689818 | 54.207733 | false |
| ibivibiv_aegolius-acadicus-v1-30b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/aegolius-acadicus-v1-30b | fecd580eb4294525160e86b79d0f205a3a44e172 | 74.700727 | llama2 | 0 | 29 | true | false | false | true | 2024-01-29T23:23:06Z | false | 72.610922 | 87.99044 | 65.10967 | 67.063459 | 84.846093 | 70.583776 | false |
| ibivibiv_alpaca-dragon-72b-v1_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibivibiv/alpaca-dragon-72b-v1 | 4df251a558c53b6b6a4c459045b161951cfc3c4e | 79.30125 | other | 24 | 72 | true | true | true | true | 2024-02-06T23:43:35Z | false | 73.890785 | 88.159729 | 77.398751 | 72.693672 | 86.029992 | 77.634572 | false |
| ibivibiv_athene-noctua-13b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibivibiv/athene-noctua-13b | 7b5e2639d2d9f0b94c7e6834e6082f7c10fc8e12 | 55.133634 | llama2 | 0 | 13 | true | true | true | true | 2024-01-22T16:33:48Z | false | 57.167235 | 81.517626 | 55.908582 | 47.491995 | 73.401736 | 15.314632 | false |
| ibivibiv_bubo-bubo-13b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibivibiv/bubo-bubo-13b | 1fe715732317ccd1c1cf295b97acd5765e209e01 | 57.861556 | llama2 | 1 | 13 | true | true | true | true | 2024-01-24T17:50:01Z | false | 61.433447 | 83.140809 | 58.184829 | 47.624403 | 76.164167 | 20.621683 | false |
| ibivibiv_dolphin-ultrafeedback-dpo_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | ibivibiv/dolphin-ultrafeedback-dpo | eef10b11d8587312121f370518f9eec97db10726 | 65.422666 | apache-2.0 | 0 | 7 | true | true | true | true | 2024-05-10T17:10:32Z | false | 64.761092 | 85.122486 | 63.330714 | 55.098178 | 77.900552 | 46.322972 | false |
| ibivibiv_llama-3-8b-instruct-alpaca-gpt-4_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | ibivibiv/llama-3-8b-instruct-alpaca-gpt-4 | c01b0d276d37d0f2c2bb34fd123fe87cd573be80 | 65.720863 | llama3 | 0 | 8 | true | true | true | true | 2024-05-28T03:00:51Z | false | 59.129693 | 78.998208 | 65.230299 | 53.866213 | 75.690608 | 61.410159 | false |
| ibivibiv_llama-3-nectar-dpo-8B_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibivibiv/llama-3-nectar-dpo-8B | 55b6eda126756b92fe5cbd09ca2a8ce245e4f491 | 67.916651 | llama3 | 0 | 8 | true | true | true | true | 2024-05-14T18:58:11Z | false | 62.201365 | 78.968333 | 67.899258 | 52.91783 | 75.611681 | 69.90144 | false |
| ibivibiv_llama3-8b-instruct-code_4bit | 4bit | 🔶 fine-tuned on domain-specific datasets | 🔶 | Adapter | ? | ibivibiv/llama3-8b-instruct-code | f1f28535a5eb5c5a060907485b3689e01e58ed43 | 63.30158 | llama3 | 1 | 8 | true | true | true | true | 2024-05-30T19:12:07Z | false | 56.399317 | 78.321052 | 64.378641 | 48.8398 | 73.796369 | 58.074299 | false |
| ibivibiv_llama3-8b-ultrafeedback-dpo_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibivibiv/llama3-8b-ultrafeedback-dpo | 1a9234e16e2710cd31fb3090a6d86c26374094b6 | 47.55877 | apache-2.0 | 1 | 8 | true | true | true | true | 2024-05-02T18:14:20Z | false | 50.682594 | 72.844055 | 41.534844 | 55.043723 | 64.640884 | 0.60652 | false |
| ibivibiv_megamarcoroni-120b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibivibiv/megamarcoroni-120b | db2d5376b1a1c36efaca83668e1ce6bfcc43356a | 66.250441 | apache-2.0 | 0 | 120 | true | true | true | true | 2024-02-23T20:34:34Z | false | 72.013652 | 88.936467 | 69.882698 | 64.238604 | 80.899763 | 21.531463 | false |
| ibivibiv_multimaster-7b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MistralForCausalLM | ibivibiv/multimaster-7b | ae4dbb285559be9ae6f1eb4bd75db30d08dde5c6 | 48.008202 | apache-2.0 | 0 | 7 | true | false | false | true | 2024-01-29T05:38:01Z | false | 41.040956 | 74.995021 | 46.93373 | 44.977744 | 68.350434 | 11.751327 | false |
| ibivibiv_multimaster-7b-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/multimaster-7b-v2 | 777deaba78991d3786f3db6a513a63695170f52d | 73.32665 | apache-2.0 | 0 | 35 | true | false | false | true | 2024-02-02T15:32:13Z | false | 70.477816 | 87.592113 | 65.093536 | 60.630196 | 84.293607 | 71.872631 | false |
| ibivibiv_multimaster-7b-v3_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/multimaster-7b-v3 | 6ee0b7c59743c3047f307643c7c1f13ada56fdd1 | 73.073333 | apache-2.0 | 0 | 35 | true | true | false | true | 2024-02-04T05:19:24Z | false | 70.392491 | 87.651862 | 65.066386 | 59.70306 | 84.056827 | 71.569371 | false |
| ibivibiv_multimaster-7b-v4_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/multimaster-7b-v4 | a89b5a4ce482c531b1cb3b8703e8eb2b9321994c | 75.469743 | apache-2.0 | 0 | 35 | true | true | true | true | 2024-02-21T21:33:19Z | false | 72.525597 | 88.767178 | 64.849902 | 70.735011 | 86.266772 | 69.673995 | false |
| ibivibiv_multimaster-7b-v5_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/multimaster-7b-v5 | 95d031f0cad065bc18387f09ce37b256756f762f | 75.01145 | apache-2.0 | 0 | 35 | true | true | true | true | 2024-02-23T03:13:32Z | false | 72.1843 | 88.418642 | 65.064662 | 70.365038 | 86.029992 | 68.006065 | false |
| ibivibiv_multimaster-7b-v6_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/multimaster-7b-v6 | cc18e2b0b9764f255341d3e530d018545987544b | 75.65865 | apache-2.0 | 1 | 35 | true | true | false | true | 2024-02-24T06:16:54Z | false | 72.78157 | 88.767178 | 64.735873 | 70.886325 | 86.424625 | 70.356331 | false |
| ibivibiv_orthorus-125b-moe_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/orthorus-125b-moe | 3d45ea8340fd5d34db86a7099c2422480fe64533 | 69.583913 | llama2 | 3 | 125 | true | true | false | true | 2024-01-24T16:26:08Z | false | 67.662116 | 85.520813 | 68.94157 | 56.273094 | 82.320442 | 56.785444 | false |
| ibivibiv_orthorus-125b-moe-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/orthorus-125b-moe-v2 | 4e6706454e0db6b216ab81c7a9a918834e289f19 | 28.678068 | | 0 | 120 | false | true | false | true | 2024-02-04T09:23:00Z | false | 26.279863 | 25.174268 | 22.793453 | 48.4917 | 49.329124 | 0 | false |
| ibivibiv_orthorus-125b-v2_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibivibiv/orthorus-125b-v2 | 95b3b4e432d98b804d64cfe42dd9fa6b67198e5b | 77.220795 | apache-2.0 | 4 | 125 | true | true | false | true | 2024-03-01T02:20:27Z | false | 73.634812 | 89.036049 | 75.991133 | 70.19361 | 85.477506 | 68.99166 | false |
| ibivibiv_strix-rufipes-70b_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibivibiv/strix-rufipes-70b | ecb80c1bd98fae238ff5c61d41e75daa4c16a02c | 70.607173 | llama2 | 3 | 68 | true | true | true | true | 2024-01-22T18:28:32Z | false | 71.331058 | 87.860984 | 69.134443 | 56.720726 | 84.767167 | 53.828658 | false |
| ibm_merlinite-7b_float16 | float16 | 💬 chat models (RLHF, DPO, IFT, ...) | 💬 | Original | MistralForCausalLM | ibm/merlinite-7b | ba52e4164e649c48b7b5d724fc8bc4020049fe28 | 64.00461 | apache-2.0 | 101 | 7 | true | true | false | true | 2024-03-04T23:44:54Z | false | 63.651877 | 84.515037 | 64.906059 | 50.147089 | 79.715864 | 41.091736 | true |
| ibndias_NeuralHermes-MoE-2x7B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibndias/NeuralHermes-MoE-2x7B | f8a3c8339ea38ce577e0c45aba859ac63b4c3cf3 | 64.08021 | apache-2.0 | 1 | 12 | true | false | false | true | 2024-01-04T01:08:08Z | false | 62.116041 | 84.206333 | 64.55513 | 43.608955 | 78.137332 | 51.857468 | false |
| ibndias_Nous-Hermes-2-MoE-2x34B_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | ibndias/Nous-Hermes-2-MoE-2x34B | af9757f0420e27e2a332cc16cbe1eeefe99cb5f1 | 73.301795 | apache-2.0 | 0 | 60 | true | true | false | true | 2024-01-12T03:49:42Z | false | 66.638225 | 85.729934 | 76.492106 | 58.08165 | 83.346488 | 69.522365 | false |
| ibranze_araproje-llama2-7b-hf_float16 | float16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | LlamaForCausalLM | ibranze/araproje-llama2-7b-hf | 7fe54f507e762b0f62265813aef908765b1298c0 | 49.725329 | | 0 | 7 | false | true | true | true | 2023-10-16T12:54:17Z | false | 53.071672 | 78.570006 | 46.799695 | 38.750841 | 74.033149 | 7.126611 | false |
| icefog72_IceCaffeLatteRP-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | icefog72/IceCaffeLatteRP-7b | 4df72b11370d899cca8ce8561f57fa220ba8ecd5 | 71.097745 | cc-by-nc-4.0 | 1 | 7 | true | false | true | true | 2024-05-27T14:05:58Z | false | 69.027304 | 86.695877 | 64.340921 | 61.164287 | 81.294396 | 64.063685 | false |
| icefog72_IceCappuccinoRP-7b_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | icefog72/IceCappuccinoRP-7b | 52f064ecdee4aef94d60f861f479c98a3b57dd66 | 62.231691 | | 0 | 7 | false | true | true | true | 2024-05-27T06:18:26Z | false | 62.627986 | 81.487751 | 57.691078 | 47.106331 | 76.637727 | 47.839272 | false |
| icefog72_IceCoffeeTest1_float16 | float16 | 🤝 base merges and moerges | 🤝 | Original | MistralForCausalLM | icefog72/IceCoffeeTest1 |
icefog72/IceCoffeeTest1
21ad7a4664129232e41bc1ba55f43ada402f13f0
71.076429
0
7
false
true
true
true
2024-04-24T02:15:49Z
false
68.34471
86.904999
64.027508
64.353178
80.584057
62.244124
false
icefog72_IceCoffeeTest11_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceCoffeeTest11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceCoffeeTest11</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceCoffeeTest11" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceCoffeeTest11
3f3c3dd70a777950401fdfef1aac2e819d983670
73.194864
0
7
false
true
true
true
2024-04-27T00:42:45Z
false
71.16041
87.741486
63.539744
70.033936
82.478295
64.215315
false
icefog72_IceCoffeeTest2_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceCoffeeTest2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceCoffeeTest2</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceCoffeeTest2" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceCoffeeTest2
ee53f846b9f449344501e39001711db9a2b782eb
71.769901
0
7
false
true
true
true
2024-04-24T03:43:32Z
false
69.539249
87.452699
64.503353
64.372542
81.294396
63.457165
false
icefog72_IceCoffeeTest3_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceCoffeeTest3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceCoffeeTest3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceCoffeeTest3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceCoffeeTest3
71754dc660858a0f01b721b10c168e4992e35404
70.622129
0
7
false
true
true
true
2024-04-24T17:14:04Z
false
68.088737
86.546505
64.173646
62.730231
79.873717
62.319939
false
icefog72_IceCoffeeTest8_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceCoffeeTest8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceCoffeeTest8</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceCoffeeTest8" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceCoffeeTest8
e1164686d933221e6e5b91409953c2202d08e6f0
71.525789
0
7
false
true
true
true
2024-04-25T21:34:43Z
false
69.112628
86.95479
64.321093
62.968809
80.899763
64.89765
false
icefog72_IceLatteRP-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceLatteRP-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceLatteRP-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceLatteRP-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceLatteRP-7b
de4142777dff8198536a8aafe92bd445d59b86bd
72.500555
cc-by-nc-4.0
6
7
true
false
true
true
2024-05-01T04:54:48Z
false
70.051195
87.741486
64.548685
66.126209
82.320442
64.215315
false
icefog72_IceLemonTeaRP-32k-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceLemonTeaRP-32k-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceLemonTeaRP-32k-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceLemonTeaRP-32k-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceLemonTeaRP-32k-7b
0e913e240aecd969bdd103bf3f131c590bdc9db4
70.427738
cc-by-nc-4.0
19
7
true
false
false
true
2024-04-03T15:53:23Z
false
67.662116
86.526588
64.507872
61.758234
79.715864
62.395754
false
icefog72_IceMochaccinoRP-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceMochaccinoRP-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceMochaccinoRP-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceMochaccinoRP-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceMochaccinoRP-7b
ea9d8537fefcd2f40beef54edbd73ce8c4772ab3
68.953575
cc-by-nc-4.0
0
7
true
false
true
true
2024-05-27T14:06:19Z
false
68.003413
85.411273
62.776453
56.21843
80.584057
60.727824
false
icefog72_IceTeaRP-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/IceTeaRP-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/IceTeaRP-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__IceTeaRP-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/IceTeaRP-7b
a6178e04aa616fcd6fc8c10ac8c2a7b5991731ae
69.759056
cc-by-nc-4.0
11
7
true
false
true
true
2024-03-28T11:16:35Z
false
66.979522
86.128261
63.965742
62.436022
78.847672
60.197119
false
icefog72_Kunokukulemonchini-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/Kunokukulemonchini-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/Kunokukulemonchini-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__Kunokukulemonchini-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/Kunokukulemonchini-7b
fd534d80a67d0959c0f42be982dc937d451b86c4
69.61249
cc-by-nc-4.0
5
7
true
false
true
true
2024-03-18T05:54:02Z
false
66.723549
86.307508
64.107331
61.886392
78.453039
60.197119
false
icefog72_OnlyForTestingIceLatteRP-7b-SmallQloraMerge_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/OnlyForTestingIceLatteRP-7b-SmallQloraMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/OnlyForTestingIceLatteRP-7b-SmallQloraMerge</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__OnlyForTestingIceLatteRP-7b-SmallQloraMerge" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/OnlyForTestingIceLatteRP-7b-SmallQloraMerge
53e0544e1975690b777cd56af1af44e8faea4c7f
70.858156
0
7
false
true
true
true
2024-05-21T23:56:04Z
false
69.880546
86.436965
64.623785
59.28889
81.689029
63.229719
false
icefog72_WestIceLemonTeaRP-32k-7b_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/WestIceLemonTeaRP-32k-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/WestIceLemonTeaRP-32k-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__WestIceLemonTeaRP-32k-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/WestIceLemonTeaRP-32k-7b
cef7b4257b9ce1fa82d60ba492925c2d54256373
71.267604
cc-by-nc-4.0
9
7
true
false
true
true
2024-04-19T12:56:28Z
false
68.771331
86.885083
64.282529
62.472678
80.97869
64.215315
false
icefog72_WizardIceLemonTeaRP-32k_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/icefog72/WizardIceLemonTeaRP-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">icefog72/WizardIceLemonTeaRP-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_icefog72__WizardIceLemonTeaRP-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
icefog72/WizardIceLemonTeaRP-32k
b7562db02139791fae3a4e941e9667616f0a8a31
67.306282
0
7
false
true
true
true
2024-04-17T02:03:20Z
false
65.614334
85.391356
63.29069
58.301219
77.03236
54.207733
false
ichigoberry_MonarchPipe-7B-slerp_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ichigoberry/MonarchPipe-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ichigoberry/MonarchPipe-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__MonarchPipe-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ichigoberry/MonarchPipe-7B-slerp
2a8a49c3c3b43fc3e3d895c5faf9ca04b340eeb4
74.067715
cc-by-nc-2.0
2
7
true
false
true
true
2024-04-01T18:36:21Z
false
69.96587
87.66182
65.304232
66.396405
81.689029
73.388931
false
ichigoberry_MonarchPipe-7B-slerp_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ichigoberry/MonarchPipe-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ichigoberry/MonarchPipe-7B-slerp</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__MonarchPipe-7B-slerp" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ichigoberry/MonarchPipe-7B-slerp
2a8a49c3c3b43fc3e3d895c5faf9ca04b340eeb4
74.055084
cc-by-nc-2.0
2
7
true
false
true
true
2024-04-03T08:29:02Z
false
69.96587
87.621988
65.239672
66.270274
81.767956
73.464746
false
ichigoberry_pandafish-2-7b-32k_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ichigoberry/pandafish-2-7b-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ichigoberry/pandafish-2-7b-32k</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__pandafish-2-7b-32k" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ichigoberry/pandafish-2-7b-32k
cbd1b8b197b7b73ca77a193f2e560c4d22857436
66.542637
apache-2.0
11
7
true
false
true
true
2024-04-05T08:23:35Z
false
64.505119
84.993029
63.608187
57.175646
79.163378
49.810462
false
ichigoberry_pandafish-7b_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ichigoberry/pandafish-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ichigoberry/pandafish-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__pandafish-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ichigoberry/pandafish-7b
8125718f8bc6ede94ede2686d5df60627b2086de
67.878742
apache-2.0
3
7
true
false
true
true
2024-04-02T18:04:51Z
false
65.187713
85.281816
64.993014
52.687325
80.820837
58.301744
false
ichigoberry_pandafish-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ichigoberry/pandafish-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ichigoberry/pandafish-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__pandafish-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ichigoberry/pandafish-7b
8125718f8bc6ede94ede2686d5df60627b2086de
67.6575
apache-2.0
3
7
true
false
true
true
2024-04-03T08:26:30Z
false
65.102389
85.301733
64.945906
52.606503
80.899763
57.088704
false
ichigoberry_pandafish-dt-7b_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ichigoberry/pandafish-dt-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ichigoberry/pandafish-dt-7b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ichigoberry__pandafish-dt-7b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ichigoberry/pandafish-dt-7b
b194caca2950a5cd84f7c4f666e1390e81375cfb
76.480255
apache-2.0
3
7
true
false
true
true
2024-04-03T11:30:14Z
false
72.696246
89.225254
64.547882
78.192206
84.92502
69.29492
false
ignos_LeoScorpius-GreenNode-Alpaca-7B-v1_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/ignos/LeoScorpius-GreenNode-Alpaca-7B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">ignos/LeoScorpius-GreenNode-Alpaca-7B-v1</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/details_ignos__LeoScorpius-GreenNode-Alpaca-7B-v1" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
ignos/LeoScorpius-GreenNode-Alpaca-7B-v1
00827d42d79b7e10ddfc92c800cbb0636704e379
74.742662
apache-2.0
0
7
true
true
true
true
2023-12-15T16:54:41Z
false
72.354949
88.159729
65.228005
69.354182
82.320442
71.038666
false
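
The records above are a flattened export of the leaderboard table: each model is listed as one value per line, following the column order of the header block at the top of this dump. Below is a minimal parsing sketch for regrouping those lines into structured records. It rests on assumptions rather than anything stated in this file: the column list mirrors the old Open LLM Leaderboard schema, a new record is taken to start at any line shaped like `<repo>_<precision>` (the precision suffixes in the regex are illustrative), and records carrying only 25 values are treated as having an empty Hub License cell that was dropped during flattening.

```python
# Sketch: regroup a field-per-line leaderboard dump into one dict per model.
# Assumptions: column order as listed below; eval_name lines end in a known
# precision suffix; 25-field records are missing only the "Hub License" value.
import re
from typing import Iterator

COLUMNS = [
    "eval_name", "Precision", "Type", "T", "Weight type", "Architecture", "Model",
    "fullname", "Model sha", "Average", "Hub License", "Hub ❤️", "#Params (B)",
    "Available on the hub", "Merged", "MoE", "Flagged", "date", "Chat Template",
    "ARC", "HellaSwag", "MMLU", "TruthfulQA", "Winogrande", "GSM8K",
    "Maintainers Choice",
]

# eval_name lines look like "ichigoberry_pandafish-7b_float16"
EVAL_NAME = re.compile(r"^\S+_(?:float16|bfloat16|4bit|8bit|GPTQ)$")


def parse_records(lines: list[str]) -> Iterator[dict]:
    """Yield one dict per record, splitting on eval_name-shaped lines."""
    record: list[str] = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        if EVAL_NAME.match(line) and record:
            yield _to_row(record)
            record = []
        record.append(line)
    if record:
        yield _to_row(record)


def _to_row(fields: list[str]) -> dict:
    if len(fields) == len(COLUMNS) - 1:
        # Assumed: the empty "Hub License" cell (index 10) was dropped
        # when the table was flattened, so reinsert a placeholder.
        fields = fields[:10] + [None] + fields[10:]
    return dict(zip(COLUMNS, fields))
```

With the dump saved to a file (the name here is hypothetical), something like `rows = list(parse_records(open("leaderboard_dump.txt").read().splitlines()))` would yield one dict per model; a leading partial record, such as the four stray values at the top of this excerpt, would need to be discarded or completed upstream before its fields line up with the column names.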