lewtun (HF staff) committed
Commit f6bf972 · verified · 1 Parent(s): b7c836d

Upload eval_results/HuggingFaceH4/mistral-7b-ift/v31.3/eval_mmlu.json with huggingface_hub
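
The commit message says the file was pushed with `huggingface_hub`. A minimal sketch of such an upload, assuming the result file exists locally; the `repo_id` below is a placeholder, not taken from this page:

```python
from huggingface_hub import HfApi

api = HfApi()
# Upload the local eval file to the matching path inside the repo.
# repo_id is a hypothetical placeholder for this sketch.
api.upload_file(
    path_or_fileobj="eval_mmlu.json",
    path_in_repo="eval_results/HuggingFaceH4/mistral-7b-ift/v31.3/eval_mmlu.json",
    repo_id="<org>/<repo>",
    commit_message="Upload eval_results/HuggingFaceH4/mistral-7b-ift/v31.3/eval_mmlu.json with huggingface_hub",
)
```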

eval_results/HuggingFaceH4/mistral-7b-ift/v31.3/eval_mmlu.json ADDED
@@ -0,0 +1,2651 @@
+ {
+ "results": {
+ "mmlu": {
+ "acc,none": 0.6290414470873095,
+ "acc_stderr,none": 0.0038616186138953307,
+ "alias": "mmlu"
+ },
+ "mmlu_humanities": {
+ "alias": " - humanities",
+ "acc,none": 0.5827842720510096,
+ "acc_stderr,none": 0.006787513340184287
+ },
+ "mmlu_formal_logic": {
+ "alias": " - formal_logic",
+ "acc,none": 0.4444444444444444,
+ "acc_stderr,none": 0.044444444444444495
+ },
+ "mmlu_high_school_european_history": {
+ "alias": " - high_school_european_history",
+ "acc,none": 0.7696969696969697,
+ "acc_stderr,none": 0.03287666758603488
+ },
+ "mmlu_high_school_us_history": {
+ "alias": " - high_school_us_history",
+ "acc,none": 0.8186274509803921,
+ "acc_stderr,none": 0.027044621719474082
+ },
+ "mmlu_high_school_world_history": {
+ "alias": " - high_school_world_history",
+ "acc,none": 0.7974683544303798,
+ "acc_stderr,none": 0.026160568246601446
+ },
+ "mmlu_international_law": {
+ "alias": " - international_law",
+ "acc,none": 0.768595041322314,
+ "acc_stderr,none": 0.03849856098794087
+ },
+ "mmlu_jurisprudence": {
+ "alias": " - jurisprudence",
+ "acc,none": 0.7592592592592593,
+ "acc_stderr,none": 0.04133119440243838
+ },
+ "mmlu_logical_fallacies": {
+ "alias": " - logical_fallacies",
+ "acc,none": 0.7730061349693251,
+ "acc_stderr,none": 0.03291099578615769
+ },
+ "mmlu_moral_disputes": {
+ "alias": " - moral_disputes",
+ "acc,none": 0.7225433526011561,
+ "acc_stderr,none": 0.024105712607754307
+ },
+ "mmlu_moral_scenarios": {
+ "alias": " - moral_scenarios",
+ "acc,none": 0.39776536312849164,
+ "acc_stderr,none": 0.016369204971262985
+ },
+ "mmlu_philosophy": {
+ "alias": " - philosophy",
+ "acc,none": 0.7202572347266881,
+ "acc_stderr,none": 0.025494259350694912
+ },
+ "mmlu_prehistory": {
+ "alias": " - prehistory",
+ "acc,none": 0.7160493827160493,
+ "acc_stderr,none": 0.02508947852376513
+ },
+ "mmlu_professional_law": {
+ "alias": " - professional_law",
+ "acc,none": 0.45632333767926986,
+ "acc_stderr,none": 0.012721420501462547
+ },
+ "mmlu_world_religions": {
+ "alias": " - world_religions",
+ "acc,none": 0.8187134502923976,
+ "acc_stderr,none": 0.029547741687640038
+ },
+ "mmlu_other": {
+ "alias": " - other",
+ "acc,none": 0.6961699388477631,
+ "acc_stderr,none": 0.00794157385703638
+ },
+ "mmlu_business_ethics": {
+ "alias": " - business_ethics",
+ "acc,none": 0.59,
+ "acc_stderr,none": 0.04943110704237102
+ },
+ "mmlu_clinical_knowledge": {
+ "alias": " - clinical_knowledge",
+ "acc,none": 0.6981132075471698,
+ "acc_stderr,none": 0.02825420034443866
+ },
+ "mmlu_college_medicine": {
+ "alias": " - college_medicine",
+ "acc,none": 0.653179190751445,
+ "acc_stderr,none": 0.03629146670159665
+ },
+ "mmlu_global_facts": {
+ "alias": " - global_facts",
+ "acc,none": 0.39,
+ "acc_stderr,none": 0.04902071300001975
+ },
+ "mmlu_human_aging": {
+ "alias": " - human_aging",
+ "acc,none": 0.6636771300448431,
+ "acc_stderr,none": 0.031708824268455005
+ },
+ "mmlu_management": {
+ "alias": " - management",
+ "acc,none": 0.8155339805825242,
+ "acc_stderr,none": 0.03840423627288276
+ },
+ "mmlu_marketing": {
+ "alias": " - marketing",
+ "acc,none": 0.8760683760683761,
+ "acc_stderr,none": 0.021586494001281348
+ },
+ "mmlu_medical_genetics": {
+ "alias": " - medical_genetics",
+ "acc,none": 0.74,
+ "acc_stderr,none": 0.04408440022768079
+ },
+ "mmlu_miscellaneous": {
+ "alias": " - miscellaneous",
+ "acc,none": 0.8020434227330779,
+ "acc_stderr,none": 0.01424887354921756
+ },
+ "mmlu_nutrition": {
+ "alias": " - nutrition",
+ "acc,none": 0.7483660130718954,
+ "acc_stderr,none": 0.0248480182638752
+ },
+ "mmlu_professional_accounting": {
+ "alias": " - professional_accounting",
+ "acc,none": 0.45390070921985815,
+ "acc_stderr,none": 0.029700453247291463
+ },
+ "mmlu_professional_medicine": {
+ "alias": " - professional_medicine",
+ "acc,none": 0.6691176470588235,
+ "acc_stderr,none": 0.028582709753898435
+ },
+ "mmlu_virology": {
+ "alias": " - virology",
+ "acc,none": 0.536144578313253,
+ "acc_stderr,none": 0.03882310850890593
+ },
+ "mmlu_social_sciences": {
+ "alias": " - social_sciences",
+ "acc,none": 0.7312317192070198,
+ "acc_stderr,none": 0.007850188394169316
+ },
+ "mmlu_econometrics": {
+ "alias": " - econometrics",
+ "acc,none": 0.5175438596491229,
+ "acc_stderr,none": 0.04700708033551038
+ },
+ "mmlu_high_school_geography": {
+ "alias": " - high_school_geography",
+ "acc,none": 0.7777777777777778,
+ "acc_stderr,none": 0.029620227874790486
+ },
+ "mmlu_high_school_government_and_politics": {
+ "alias": " - high_school_government_and_politics",
+ "acc,none": 0.8652849740932642,
+ "acc_stderr,none": 0.024639789097709443
+ },
+ "mmlu_high_school_macroeconomics": {
+ "alias": " - high_school_macroeconomics",
+ "acc,none": 0.6564102564102564,
+ "acc_stderr,none": 0.024078696580635474
+ },
+ "mmlu_high_school_microeconomics": {
+ "alias": " - high_school_microeconomics",
+ "acc,none": 0.6722689075630253,
+ "acc_stderr,none": 0.030489911417673227
+ },
+ "mmlu_high_school_psychology": {
+ "alias": " - high_school_psychology",
+ "acc,none": 0.8275229357798165,
+ "acc_stderr,none": 0.01619780795684803
+ },
+ "mmlu_human_sexuality": {
+ "alias": " - human_sexuality",
+ "acc,none": 0.7709923664122137,
+ "acc_stderr,none": 0.036853466317118506
+ },
+ "mmlu_professional_psychology": {
+ "alias": " - professional_psychology",
+ "acc,none": 0.6683006535947712,
+ "acc_stderr,none": 0.019047485239360385
+ },
+ "mmlu_public_relations": {
+ "alias": " - public_relations",
+ "acc,none": 0.6272727272727273,
+ "acc_stderr,none": 0.04631381319425463
+ },
+ "mmlu_security_studies": {
+ "alias": " - security_studies",
+ "acc,none": 0.7224489795918367,
+ "acc_stderr,none": 0.028666857790274648
+ },
+ "mmlu_sociology": {
+ "alias": " - sociology",
+ "acc,none": 0.8059701492537313,
+ "acc_stderr,none": 0.027962677604768914
+ },
+ "mmlu_us_foreign_policy": {
+ "alias": " - us_foreign_policy",
+ "acc,none": 0.85,
+ "acc_stderr,none": 0.03588702812826371
+ },
+ "mmlu_stem": {
+ "alias": " - stem",
+ "acc,none": 0.5321915635902316,
+ "acc_stderr,none": 0.008558424855038367
+ },
+ "mmlu_abstract_algebra": {
+ "alias": " - abstract_algebra",
+ "acc,none": 0.26,
+ "acc_stderr,none": 0.044084400227680794
+ },
+ "mmlu_anatomy": {
+ "alias": " - anatomy",
+ "acc,none": 0.6370370370370371,
+ "acc_stderr,none": 0.04153948404742398
+ },
+ "mmlu_astronomy": {
+ "alias": " - astronomy",
+ "acc,none": 0.6842105263157895,
+ "acc_stderr,none": 0.0378272898086547
+ },
+ "mmlu_college_biology": {
+ "alias": " - college_biology",
+ "acc,none": 0.7222222222222222,
+ "acc_stderr,none": 0.03745554791462457
+ },
+ "mmlu_college_chemistry": {
+ "alias": " - college_chemistry",
+ "acc,none": 0.52,
+ "acc_stderr,none": 0.050211673156867795
+ },
+ "mmlu_college_computer_science": {
+ "alias": " - college_computer_science",
+ "acc,none": 0.53,
+ "acc_stderr,none": 0.050161355804659205
+ },
+ "mmlu_college_mathematics": {
+ "alias": " - college_mathematics",
+ "acc,none": 0.38,
+ "acc_stderr,none": 0.048783173121456316
+ },
+ "mmlu_college_physics": {
+ "alias": " - college_physics",
+ "acc,none": 0.4411764705882353,
+ "acc_stderr,none": 0.04940635630605659
+ },
+ "mmlu_computer_security": {
+ "alias": " - computer_security",
+ "acc,none": 0.72,
+ "acc_stderr,none": 0.045126085985421276
+ },
+ "mmlu_conceptual_physics": {
+ "alias": " - conceptual_physics",
+ "acc,none": 0.5872340425531914,
+ "acc_stderr,none": 0.03218471141400351
+ },
+ "mmlu_electrical_engineering": {
+ "alias": " - electrical_engineering",
+ "acc,none": 0.5448275862068965,
+ "acc_stderr,none": 0.04149886942192117
+ },
+ "mmlu_elementary_mathematics": {
+ "alias": " - elementary_mathematics",
+ "acc,none": 0.4126984126984127,
+ "acc_stderr,none": 0.025355741263055266
+ },
+ "mmlu_high_school_biology": {
+ "alias": " - high_school_biology",
+ "acc,none": 0.7645161290322581,
+ "acc_stderr,none": 0.024137632429337714
+ },
+ "mmlu_high_school_chemistry": {
+ "alias": " - high_school_chemistry",
+ "acc,none": 0.5270935960591133,
+ "acc_stderr,none": 0.03512819077876106
+ },
+ "mmlu_high_school_computer_science": {
+ "alias": " - high_school_computer_science",
+ "acc,none": 0.65,
+ "acc_stderr,none": 0.047937248544110196
+ },
+ "mmlu_high_school_mathematics": {
+ "alias": " - high_school_mathematics",
+ "acc,none": 0.37777777777777777,
+ "acc_stderr,none": 0.029560707392465718
+ },
+ "mmlu_high_school_physics": {
+ "alias": " - high_school_physics",
+ "acc,none": 0.3443708609271523,
+ "acc_stderr,none": 0.038796870240733264
+ },
+ "mmlu_high_school_statistics": {
+ "alias": " - high_school_statistics",
+ "acc,none": 0.5185185185185185,
+ "acc_stderr,none": 0.03407632093854051
+ },
+ "mmlu_machine_learning": {
+ "alias": " - machine_learning",
+ "acc,none": 0.44642857142857145,
+ "acc_stderr,none": 0.047184714852195886
+ }
+ },
+ "groups": {
+ "mmlu": {
+ "acc,none": 0.6290414470873095,
+ "acc_stderr,none": 0.0038616186138953307,
+ "alias": "mmlu"
+ },
+ "mmlu_humanities": {
+ "alias": " - humanities",
+ "acc,none": 0.5827842720510096,
+ "acc_stderr,none": 0.006787513340184287
+ },
+ "mmlu_other": {
+ "alias": " - other",
+ "acc,none": 0.6961699388477631,
+ "acc_stderr,none": 0.00794157385703638
+ },
+ "mmlu_social_sciences": {
+ "alias": " - social_sciences",
+ "acc,none": 0.7312317192070198,
+ "acc_stderr,none": 0.007850188394169316
+ },
+ "mmlu_stem": {
+ "alias": " - stem",
+ "acc,none": 0.5321915635902316,
+ "acc_stderr,none": 0.008558424855038367
+ }
+ },
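
For reference, a small sketch (assuming the JSON above has been downloaded locally as `eval_mmlu.json`; the filename is illustrative) that reads back the aggregate and per-category accuracies from the `results` and `groups` sections:

```python
import json

# Load the uploaded lm-eval results file.
with open("eval_mmlu.json") as f:
    data = json.load(f)

# Overall MMLU accuracy and standard error.
overall = data["results"]["mmlu"]
print(f"mmlu acc = {overall['acc,none']:.4f} ± {overall['acc_stderr,none']:.4f}")

# Per-category aggregates (humanities, other, social_sciences, stem).
for name, scores in data["groups"].items():
    if name == "mmlu":
        continue
    print(f"{scores['alias'].strip():>20s}: {scores['acc,none']:.4f}")
```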
341
+ "configs": {
342
+ "mmlu_abstract_algebra": {
343
+ "task": "mmlu_abstract_algebra",
344
+ "task_alias": "abstract_algebra",
345
+ "group": "mmlu_stem",
346
+ "group_alias": "stem",
347
+ "dataset_path": "hails/mmlu_no_train",
348
+ "dataset_name": "abstract_algebra",
349
+ "test_split": "test",
350
+ "fewshot_split": "dev",
351
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
352
+ "doc_to_target": "answer",
353
+ "doc_to_choice": [
354
+ "A",
355
+ "B",
356
+ "C",
357
+ "D"
358
+ ],
359
+ "description": "The following are multiple choice questions (with answers) about abstract algebra.\n\n",
360
+ "target_delimiter": " ",
361
+ "fewshot_delimiter": "\n\n",
362
+ "fewshot_config": {
363
+ "sampler": "first_n"
364
+ },
365
+ "num_fewshot": 5,
366
+ "metric_list": [
367
+ {
368
+ "metric": "acc",
369
+ "aggregation": "mean",
370
+ "higher_is_better": true
371
+ }
372
+ ],
373
+ "output_type": "multiple_choice",
374
+ "repeats": 1,
375
+ "should_decontaminate": false,
376
+ "metadata": {
377
+ "version": 0.0
378
+ }
379
+ },
380
+ "mmlu_anatomy": {
381
+ "task": "mmlu_anatomy",
382
+ "task_alias": "anatomy",
383
+ "group": "mmlu_stem",
384
+ "group_alias": "stem",
385
+ "dataset_path": "hails/mmlu_no_train",
386
+ "dataset_name": "anatomy",
387
+ "test_split": "test",
388
+ "fewshot_split": "dev",
389
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
390
+ "doc_to_target": "answer",
391
+ "doc_to_choice": [
392
+ "A",
393
+ "B",
394
+ "C",
395
+ "D"
396
+ ],
397
+ "description": "The following are multiple choice questions (with answers) about anatomy.\n\n",
398
+ "target_delimiter": " ",
399
+ "fewshot_delimiter": "\n\n",
400
+ "fewshot_config": {
401
+ "sampler": "first_n"
402
+ },
403
+ "num_fewshot": 5,
404
+ "metric_list": [
405
+ {
406
+ "metric": "acc",
407
+ "aggregation": "mean",
408
+ "higher_is_better": true
409
+ }
410
+ ],
411
+ "output_type": "multiple_choice",
412
+ "repeats": 1,
413
+ "should_decontaminate": false,
414
+ "metadata": {
415
+ "version": 0.0
416
+ }
417
+ },
418
+ "mmlu_astronomy": {
419
+ "task": "mmlu_astronomy",
420
+ "task_alias": "astronomy",
421
+ "group": "mmlu_stem",
422
+ "group_alias": "stem",
423
+ "dataset_path": "hails/mmlu_no_train",
424
+ "dataset_name": "astronomy",
425
+ "test_split": "test",
426
+ "fewshot_split": "dev",
427
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
428
+ "doc_to_target": "answer",
429
+ "doc_to_choice": [
430
+ "A",
431
+ "B",
432
+ "C",
433
+ "D"
434
+ ],
435
+ "description": "The following are multiple choice questions (with answers) about astronomy.\n\n",
436
+ "target_delimiter": " ",
437
+ "fewshot_delimiter": "\n\n",
438
+ "fewshot_config": {
439
+ "sampler": "first_n"
440
+ },
441
+ "num_fewshot": 5,
442
+ "metric_list": [
443
+ {
444
+ "metric": "acc",
445
+ "aggregation": "mean",
446
+ "higher_is_better": true
447
+ }
448
+ ],
449
+ "output_type": "multiple_choice",
450
+ "repeats": 1,
451
+ "should_decontaminate": false,
452
+ "metadata": {
453
+ "version": 0.0
454
+ }
455
+ },
456
+ "mmlu_business_ethics": {
457
+ "task": "mmlu_business_ethics",
458
+ "task_alias": "business_ethics",
459
+ "group": "mmlu_other",
460
+ "group_alias": "other",
461
+ "dataset_path": "hails/mmlu_no_train",
462
+ "dataset_name": "business_ethics",
463
+ "test_split": "test",
464
+ "fewshot_split": "dev",
465
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
466
+ "doc_to_target": "answer",
467
+ "doc_to_choice": [
468
+ "A",
469
+ "B",
470
+ "C",
471
+ "D"
472
+ ],
473
+ "description": "The following are multiple choice questions (with answers) about business ethics.\n\n",
474
+ "target_delimiter": " ",
475
+ "fewshot_delimiter": "\n\n",
476
+ "fewshot_config": {
477
+ "sampler": "first_n"
478
+ },
479
+ "num_fewshot": 5,
480
+ "metric_list": [
481
+ {
482
+ "metric": "acc",
483
+ "aggregation": "mean",
484
+ "higher_is_better": true
485
+ }
486
+ ],
487
+ "output_type": "multiple_choice",
488
+ "repeats": 1,
489
+ "should_decontaminate": false,
490
+ "metadata": {
491
+ "version": 0.0
492
+ }
493
+ },
494
+ "mmlu_clinical_knowledge": {
495
+ "task": "mmlu_clinical_knowledge",
496
+ "task_alias": "clinical_knowledge",
497
+ "group": "mmlu_other",
498
+ "group_alias": "other",
499
+ "dataset_path": "hails/mmlu_no_train",
500
+ "dataset_name": "clinical_knowledge",
501
+ "test_split": "test",
502
+ "fewshot_split": "dev",
503
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
504
+ "doc_to_target": "answer",
505
+ "doc_to_choice": [
506
+ "A",
507
+ "B",
508
+ "C",
509
+ "D"
510
+ ],
511
+ "description": "The following are multiple choice questions (with answers) about clinical knowledge.\n\n",
512
+ "target_delimiter": " ",
513
+ "fewshot_delimiter": "\n\n",
514
+ "fewshot_config": {
515
+ "sampler": "first_n"
516
+ },
517
+ "num_fewshot": 5,
518
+ "metric_list": [
519
+ {
520
+ "metric": "acc",
521
+ "aggregation": "mean",
522
+ "higher_is_better": true
523
+ }
524
+ ],
525
+ "output_type": "multiple_choice",
526
+ "repeats": 1,
527
+ "should_decontaminate": false,
528
+ "metadata": {
529
+ "version": 0.0
530
+ }
531
+ },
532
+ "mmlu_college_biology": {
533
+ "task": "mmlu_college_biology",
534
+ "task_alias": "college_biology",
535
+ "group": "mmlu_stem",
536
+ "group_alias": "stem",
537
+ "dataset_path": "hails/mmlu_no_train",
538
+ "dataset_name": "college_biology",
539
+ "test_split": "test",
540
+ "fewshot_split": "dev",
541
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
542
+ "doc_to_target": "answer",
543
+ "doc_to_choice": [
544
+ "A",
545
+ "B",
546
+ "C",
547
+ "D"
548
+ ],
549
+ "description": "The following are multiple choice questions (with answers) about college biology.\n\n",
550
+ "target_delimiter": " ",
551
+ "fewshot_delimiter": "\n\n",
552
+ "fewshot_config": {
553
+ "sampler": "first_n"
554
+ },
555
+ "num_fewshot": 5,
556
+ "metric_list": [
557
+ {
558
+ "metric": "acc",
559
+ "aggregation": "mean",
560
+ "higher_is_better": true
561
+ }
562
+ ],
563
+ "output_type": "multiple_choice",
564
+ "repeats": 1,
565
+ "should_decontaminate": false,
566
+ "metadata": {
567
+ "version": 0.0
568
+ }
569
+ },
570
+ "mmlu_college_chemistry": {
571
+ "task": "mmlu_college_chemistry",
572
+ "task_alias": "college_chemistry",
573
+ "group": "mmlu_stem",
574
+ "group_alias": "stem",
575
+ "dataset_path": "hails/mmlu_no_train",
576
+ "dataset_name": "college_chemistry",
577
+ "test_split": "test",
578
+ "fewshot_split": "dev",
579
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
580
+ "doc_to_target": "answer",
581
+ "doc_to_choice": [
582
+ "A",
583
+ "B",
584
+ "C",
585
+ "D"
586
+ ],
587
+ "description": "The following are multiple choice questions (with answers) about college chemistry.\n\n",
588
+ "target_delimiter": " ",
589
+ "fewshot_delimiter": "\n\n",
590
+ "fewshot_config": {
591
+ "sampler": "first_n"
592
+ },
593
+ "num_fewshot": 5,
594
+ "metric_list": [
595
+ {
596
+ "metric": "acc",
597
+ "aggregation": "mean",
598
+ "higher_is_better": true
599
+ }
600
+ ],
601
+ "output_type": "multiple_choice",
602
+ "repeats": 1,
603
+ "should_decontaminate": false,
604
+ "metadata": {
605
+ "version": 0.0
606
+ }
607
+ },
608
+ "mmlu_college_computer_science": {
609
+ "task": "mmlu_college_computer_science",
610
+ "task_alias": "college_computer_science",
611
+ "group": "mmlu_stem",
612
+ "group_alias": "stem",
613
+ "dataset_path": "hails/mmlu_no_train",
614
+ "dataset_name": "college_computer_science",
615
+ "test_split": "test",
616
+ "fewshot_split": "dev",
617
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
618
+ "doc_to_target": "answer",
619
+ "doc_to_choice": [
620
+ "A",
621
+ "B",
622
+ "C",
623
+ "D"
624
+ ],
625
+ "description": "The following are multiple choice questions (with answers) about college computer science.\n\n",
626
+ "target_delimiter": " ",
627
+ "fewshot_delimiter": "\n\n",
628
+ "fewshot_config": {
629
+ "sampler": "first_n"
630
+ },
631
+ "num_fewshot": 5,
632
+ "metric_list": [
633
+ {
634
+ "metric": "acc",
635
+ "aggregation": "mean",
636
+ "higher_is_better": true
637
+ }
638
+ ],
639
+ "output_type": "multiple_choice",
640
+ "repeats": 1,
641
+ "should_decontaminate": false,
642
+ "metadata": {
643
+ "version": 0.0
644
+ }
645
+ },
646
+ "mmlu_college_mathematics": {
647
+ "task": "mmlu_college_mathematics",
648
+ "task_alias": "college_mathematics",
649
+ "group": "mmlu_stem",
650
+ "group_alias": "stem",
651
+ "dataset_path": "hails/mmlu_no_train",
652
+ "dataset_name": "college_mathematics",
653
+ "test_split": "test",
654
+ "fewshot_split": "dev",
655
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
656
+ "doc_to_target": "answer",
657
+ "doc_to_choice": [
658
+ "A",
659
+ "B",
660
+ "C",
661
+ "D"
662
+ ],
663
+ "description": "The following are multiple choice questions (with answers) about college mathematics.\n\n",
664
+ "target_delimiter": " ",
665
+ "fewshot_delimiter": "\n\n",
666
+ "fewshot_config": {
667
+ "sampler": "first_n"
668
+ },
669
+ "num_fewshot": 5,
670
+ "metric_list": [
671
+ {
672
+ "metric": "acc",
673
+ "aggregation": "mean",
674
+ "higher_is_better": true
675
+ }
676
+ ],
677
+ "output_type": "multiple_choice",
678
+ "repeats": 1,
679
+ "should_decontaminate": false,
680
+ "metadata": {
681
+ "version": 0.0
682
+ }
683
+ },
684
+ "mmlu_college_medicine": {
685
+ "task": "mmlu_college_medicine",
686
+ "task_alias": "college_medicine",
687
+ "group": "mmlu_other",
688
+ "group_alias": "other",
689
+ "dataset_path": "hails/mmlu_no_train",
690
+ "dataset_name": "college_medicine",
691
+ "test_split": "test",
692
+ "fewshot_split": "dev",
693
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
694
+ "doc_to_target": "answer",
695
+ "doc_to_choice": [
696
+ "A",
697
+ "B",
698
+ "C",
699
+ "D"
700
+ ],
701
+ "description": "The following are multiple choice questions (with answers) about college medicine.\n\n",
702
+ "target_delimiter": " ",
703
+ "fewshot_delimiter": "\n\n",
704
+ "fewshot_config": {
705
+ "sampler": "first_n"
706
+ },
707
+ "num_fewshot": 5,
708
+ "metric_list": [
709
+ {
710
+ "metric": "acc",
711
+ "aggregation": "mean",
712
+ "higher_is_better": true
713
+ }
714
+ ],
715
+ "output_type": "multiple_choice",
716
+ "repeats": 1,
717
+ "should_decontaminate": false,
718
+ "metadata": {
719
+ "version": 0.0
720
+ }
721
+ },
722
+ "mmlu_college_physics": {
723
+ "task": "mmlu_college_physics",
724
+ "task_alias": "college_physics",
725
+ "group": "mmlu_stem",
726
+ "group_alias": "stem",
727
+ "dataset_path": "hails/mmlu_no_train",
728
+ "dataset_name": "college_physics",
729
+ "test_split": "test",
730
+ "fewshot_split": "dev",
731
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
732
+ "doc_to_target": "answer",
733
+ "doc_to_choice": [
734
+ "A",
735
+ "B",
736
+ "C",
737
+ "D"
738
+ ],
739
+ "description": "The following are multiple choice questions (with answers) about college physics.\n\n",
740
+ "target_delimiter": " ",
741
+ "fewshot_delimiter": "\n\n",
742
+ "fewshot_config": {
743
+ "sampler": "first_n"
744
+ },
745
+ "num_fewshot": 5,
746
+ "metric_list": [
747
+ {
748
+ "metric": "acc",
749
+ "aggregation": "mean",
750
+ "higher_is_better": true
751
+ }
752
+ ],
753
+ "output_type": "multiple_choice",
754
+ "repeats": 1,
755
+ "should_decontaminate": false,
756
+ "metadata": {
757
+ "version": 0.0
758
+ }
759
+ },
760
+ "mmlu_computer_security": {
761
+ "task": "mmlu_computer_security",
762
+ "task_alias": "computer_security",
763
+ "group": "mmlu_stem",
764
+ "group_alias": "stem",
765
+ "dataset_path": "hails/mmlu_no_train",
766
+ "dataset_name": "computer_security",
767
+ "test_split": "test",
768
+ "fewshot_split": "dev",
769
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
770
+ "doc_to_target": "answer",
771
+ "doc_to_choice": [
772
+ "A",
773
+ "B",
774
+ "C",
775
+ "D"
776
+ ],
777
+ "description": "The following are multiple choice questions (with answers) about computer security.\n\n",
778
+ "target_delimiter": " ",
779
+ "fewshot_delimiter": "\n\n",
780
+ "fewshot_config": {
781
+ "sampler": "first_n"
782
+ },
783
+ "num_fewshot": 5,
784
+ "metric_list": [
785
+ {
786
+ "metric": "acc",
787
+ "aggregation": "mean",
788
+ "higher_is_better": true
789
+ }
790
+ ],
791
+ "output_type": "multiple_choice",
792
+ "repeats": 1,
793
+ "should_decontaminate": false,
794
+ "metadata": {
795
+ "version": 0.0
796
+ }
797
+ },
798
+ "mmlu_conceptual_physics": {
799
+ "task": "mmlu_conceptual_physics",
800
+ "task_alias": "conceptual_physics",
801
+ "group": "mmlu_stem",
802
+ "group_alias": "stem",
803
+ "dataset_path": "hails/mmlu_no_train",
804
+ "dataset_name": "conceptual_physics",
805
+ "test_split": "test",
806
+ "fewshot_split": "dev",
807
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
808
+ "doc_to_target": "answer",
809
+ "doc_to_choice": [
810
+ "A",
811
+ "B",
812
+ "C",
813
+ "D"
814
+ ],
815
+ "description": "The following are multiple choice questions (with answers) about conceptual physics.\n\n",
816
+ "target_delimiter": " ",
817
+ "fewshot_delimiter": "\n\n",
818
+ "fewshot_config": {
819
+ "sampler": "first_n"
820
+ },
821
+ "num_fewshot": 5,
822
+ "metric_list": [
823
+ {
824
+ "metric": "acc",
825
+ "aggregation": "mean",
826
+ "higher_is_better": true
827
+ }
828
+ ],
829
+ "output_type": "multiple_choice",
830
+ "repeats": 1,
831
+ "should_decontaminate": false,
832
+ "metadata": {
833
+ "version": 0.0
834
+ }
835
+ },
836
+ "mmlu_econometrics": {
837
+ "task": "mmlu_econometrics",
838
+ "task_alias": "econometrics",
839
+ "group": "mmlu_social_sciences",
840
+ "group_alias": "social_sciences",
841
+ "dataset_path": "hails/mmlu_no_train",
842
+ "dataset_name": "econometrics",
843
+ "test_split": "test",
844
+ "fewshot_split": "dev",
845
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
846
+ "doc_to_target": "answer",
847
+ "doc_to_choice": [
848
+ "A",
849
+ "B",
850
+ "C",
851
+ "D"
852
+ ],
853
+ "description": "The following are multiple choice questions (with answers) about econometrics.\n\n",
854
+ "target_delimiter": " ",
855
+ "fewshot_delimiter": "\n\n",
856
+ "fewshot_config": {
857
+ "sampler": "first_n"
858
+ },
859
+ "num_fewshot": 5,
860
+ "metric_list": [
861
+ {
862
+ "metric": "acc",
863
+ "aggregation": "mean",
864
+ "higher_is_better": true
865
+ }
866
+ ],
867
+ "output_type": "multiple_choice",
868
+ "repeats": 1,
869
+ "should_decontaminate": false,
870
+ "metadata": {
871
+ "version": 0.0
872
+ }
873
+ },
874
+ "mmlu_electrical_engineering": {
875
+ "task": "mmlu_electrical_engineering",
876
+ "task_alias": "electrical_engineering",
877
+ "group": "mmlu_stem",
878
+ "group_alias": "stem",
879
+ "dataset_path": "hails/mmlu_no_train",
880
+ "dataset_name": "electrical_engineering",
881
+ "test_split": "test",
882
+ "fewshot_split": "dev",
883
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
884
+ "doc_to_target": "answer",
885
+ "doc_to_choice": [
886
+ "A",
887
+ "B",
888
+ "C",
889
+ "D"
890
+ ],
891
+ "description": "The following are multiple choice questions (with answers) about electrical engineering.\n\n",
892
+ "target_delimiter": " ",
893
+ "fewshot_delimiter": "\n\n",
894
+ "fewshot_config": {
895
+ "sampler": "first_n"
896
+ },
897
+ "num_fewshot": 5,
898
+ "metric_list": [
899
+ {
900
+ "metric": "acc",
901
+ "aggregation": "mean",
902
+ "higher_is_better": true
903
+ }
904
+ ],
905
+ "output_type": "multiple_choice",
906
+ "repeats": 1,
907
+ "should_decontaminate": false,
908
+ "metadata": {
909
+ "version": 0.0
910
+ }
911
+ },
912
+ "mmlu_elementary_mathematics": {
913
+ "task": "mmlu_elementary_mathematics",
914
+ "task_alias": "elementary_mathematics",
915
+ "group": "mmlu_stem",
916
+ "group_alias": "stem",
917
+ "dataset_path": "hails/mmlu_no_train",
918
+ "dataset_name": "elementary_mathematics",
919
+ "test_split": "test",
920
+ "fewshot_split": "dev",
921
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
922
+ "doc_to_target": "answer",
923
+ "doc_to_choice": [
924
+ "A",
925
+ "B",
926
+ "C",
927
+ "D"
928
+ ],
929
+ "description": "The following are multiple choice questions (with answers) about elementary mathematics.\n\n",
930
+ "target_delimiter": " ",
931
+ "fewshot_delimiter": "\n\n",
932
+ "fewshot_config": {
933
+ "sampler": "first_n"
934
+ },
935
+ "num_fewshot": 5,
936
+ "metric_list": [
937
+ {
938
+ "metric": "acc",
939
+ "aggregation": "mean",
940
+ "higher_is_better": true
941
+ }
942
+ ],
943
+ "output_type": "multiple_choice",
944
+ "repeats": 1,
945
+ "should_decontaminate": false,
946
+ "metadata": {
947
+ "version": 0.0
948
+ }
949
+ },
950
+ "mmlu_formal_logic": {
951
+ "task": "mmlu_formal_logic",
952
+ "task_alias": "formal_logic",
953
+ "group": "mmlu_humanities",
954
+ "group_alias": "humanities",
955
+ "dataset_path": "hails/mmlu_no_train",
956
+ "dataset_name": "formal_logic",
957
+ "test_split": "test",
958
+ "fewshot_split": "dev",
959
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
960
+ "doc_to_target": "answer",
961
+ "doc_to_choice": [
962
+ "A",
963
+ "B",
964
+ "C",
965
+ "D"
966
+ ],
967
+ "description": "The following are multiple choice questions (with answers) about formal logic.\n\n",
968
+ "target_delimiter": " ",
969
+ "fewshot_delimiter": "\n\n",
970
+ "fewshot_config": {
971
+ "sampler": "first_n"
972
+ },
973
+ "num_fewshot": 5,
974
+ "metric_list": [
975
+ {
976
+ "metric": "acc",
977
+ "aggregation": "mean",
978
+ "higher_is_better": true
979
+ }
980
+ ],
981
+ "output_type": "multiple_choice",
982
+ "repeats": 1,
983
+ "should_decontaminate": false,
984
+ "metadata": {
985
+ "version": 0.0
986
+ }
987
+ },
988
+ "mmlu_global_facts": {
989
+ "task": "mmlu_global_facts",
990
+ "task_alias": "global_facts",
991
+ "group": "mmlu_other",
992
+ "group_alias": "other",
993
+ "dataset_path": "hails/mmlu_no_train",
994
+ "dataset_name": "global_facts",
995
+ "test_split": "test",
996
+ "fewshot_split": "dev",
997
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
998
+ "doc_to_target": "answer",
999
+ "doc_to_choice": [
1000
+ "A",
1001
+ "B",
1002
+ "C",
1003
+ "D"
1004
+ ],
1005
+ "description": "The following are multiple choice questions (with answers) about global facts.\n\n",
1006
+ "target_delimiter": " ",
1007
+ "fewshot_delimiter": "\n\n",
1008
+ "fewshot_config": {
1009
+ "sampler": "first_n"
1010
+ },
1011
+ "num_fewshot": 5,
1012
+ "metric_list": [
1013
+ {
1014
+ "metric": "acc",
1015
+ "aggregation": "mean",
1016
+ "higher_is_better": true
1017
+ }
1018
+ ],
1019
+ "output_type": "multiple_choice",
1020
+ "repeats": 1,
1021
+ "should_decontaminate": false,
1022
+ "metadata": {
1023
+ "version": 0.0
1024
+ }
1025
+ },
1026
+ "mmlu_high_school_biology": {
1027
+ "task": "mmlu_high_school_biology",
1028
+ "task_alias": "high_school_biology",
1029
+ "group": "mmlu_stem",
1030
+ "group_alias": "stem",
1031
+ "dataset_path": "hails/mmlu_no_train",
1032
+ "dataset_name": "high_school_biology",
1033
+ "test_split": "test",
1034
+ "fewshot_split": "dev",
1035
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1036
+ "doc_to_target": "answer",
1037
+ "doc_to_choice": [
1038
+ "A",
1039
+ "B",
1040
+ "C",
1041
+ "D"
1042
+ ],
1043
+ "description": "The following are multiple choice questions (with answers) about high school biology.\n\n",
1044
+ "target_delimiter": " ",
1045
+ "fewshot_delimiter": "\n\n",
1046
+ "fewshot_config": {
1047
+ "sampler": "first_n"
1048
+ },
1049
+ "num_fewshot": 5,
1050
+ "metric_list": [
1051
+ {
1052
+ "metric": "acc",
1053
+ "aggregation": "mean",
1054
+ "higher_is_better": true
1055
+ }
1056
+ ],
1057
+ "output_type": "multiple_choice",
1058
+ "repeats": 1,
1059
+ "should_decontaminate": false,
1060
+ "metadata": {
1061
+ "version": 0.0
1062
+ }
1063
+ },
1064
+ "mmlu_high_school_chemistry": {
1065
+ "task": "mmlu_high_school_chemistry",
1066
+ "task_alias": "high_school_chemistry",
1067
+ "group": "mmlu_stem",
1068
+ "group_alias": "stem",
1069
+ "dataset_path": "hails/mmlu_no_train",
1070
+ "dataset_name": "high_school_chemistry",
1071
+ "test_split": "test",
1072
+ "fewshot_split": "dev",
1073
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1074
+ "doc_to_target": "answer",
1075
+ "doc_to_choice": [
1076
+ "A",
1077
+ "B",
1078
+ "C",
1079
+ "D"
1080
+ ],
1081
+ "description": "The following are multiple choice questions (with answers) about high school chemistry.\n\n",
1082
+ "target_delimiter": " ",
1083
+ "fewshot_delimiter": "\n\n",
1084
+ "fewshot_config": {
1085
+ "sampler": "first_n"
1086
+ },
1087
+ "num_fewshot": 5,
1088
+ "metric_list": [
1089
+ {
1090
+ "metric": "acc",
1091
+ "aggregation": "mean",
1092
+ "higher_is_better": true
1093
+ }
1094
+ ],
1095
+ "output_type": "multiple_choice",
1096
+ "repeats": 1,
1097
+ "should_decontaminate": false,
1098
+ "metadata": {
1099
+ "version": 0.0
1100
+ }
1101
+ },
1102
+ "mmlu_high_school_computer_science": {
1103
+ "task": "mmlu_high_school_computer_science",
1104
+ "task_alias": "high_school_computer_science",
1105
+ "group": "mmlu_stem",
1106
+ "group_alias": "stem",
1107
+ "dataset_path": "hails/mmlu_no_train",
1108
+ "dataset_name": "high_school_computer_science",
1109
+ "test_split": "test",
1110
+ "fewshot_split": "dev",
1111
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1112
+ "doc_to_target": "answer",
1113
+ "doc_to_choice": [
1114
+ "A",
1115
+ "B",
1116
+ "C",
1117
+ "D"
1118
+ ],
1119
+ "description": "The following are multiple choice questions (with answers) about high school computer science.\n\n",
1120
+ "target_delimiter": " ",
1121
+ "fewshot_delimiter": "\n\n",
1122
+ "fewshot_config": {
1123
+ "sampler": "first_n"
1124
+ },
1125
+ "num_fewshot": 5,
1126
+ "metric_list": [
1127
+ {
1128
+ "metric": "acc",
1129
+ "aggregation": "mean",
1130
+ "higher_is_better": true
1131
+ }
1132
+ ],
1133
+ "output_type": "multiple_choice",
1134
+ "repeats": 1,
1135
+ "should_decontaminate": false,
1136
+ "metadata": {
1137
+ "version": 0.0
1138
+ }
1139
+ },
1140
+ "mmlu_high_school_european_history": {
1141
+ "task": "mmlu_high_school_european_history",
1142
+ "task_alias": "high_school_european_history",
1143
+ "group": "mmlu_humanities",
1144
+ "group_alias": "humanities",
1145
+ "dataset_path": "hails/mmlu_no_train",
1146
+ "dataset_name": "high_school_european_history",
1147
+ "test_split": "test",
1148
+ "fewshot_split": "dev",
1149
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1150
+ "doc_to_target": "answer",
1151
+ "doc_to_choice": [
1152
+ "A",
1153
+ "B",
1154
+ "C",
1155
+ "D"
1156
+ ],
1157
+ "description": "The following are multiple choice questions (with answers) about high school european history.\n\n",
1158
+ "target_delimiter": " ",
1159
+ "fewshot_delimiter": "\n\n",
1160
+ "fewshot_config": {
1161
+ "sampler": "first_n"
1162
+ },
1163
+ "num_fewshot": 5,
1164
+ "metric_list": [
1165
+ {
1166
+ "metric": "acc",
1167
+ "aggregation": "mean",
1168
+ "higher_is_better": true
1169
+ }
1170
+ ],
1171
+ "output_type": "multiple_choice",
1172
+ "repeats": 1,
1173
+ "should_decontaminate": false,
1174
+ "metadata": {
1175
+ "version": 0.0
1176
+ }
1177
+ },
1178
+ "mmlu_high_school_geography": {
1179
+ "task": "mmlu_high_school_geography",
1180
+ "task_alias": "high_school_geography",
1181
+ "group": "mmlu_social_sciences",
1182
+ "group_alias": "social_sciences",
1183
+ "dataset_path": "hails/mmlu_no_train",
1184
+ "dataset_name": "high_school_geography",
1185
+ "test_split": "test",
1186
+ "fewshot_split": "dev",
1187
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1188
+ "doc_to_target": "answer",
1189
+ "doc_to_choice": [
1190
+ "A",
1191
+ "B",
1192
+ "C",
1193
+ "D"
1194
+ ],
1195
+ "description": "The following are multiple choice questions (with answers) about high school geography.\n\n",
1196
+ "target_delimiter": " ",
1197
+ "fewshot_delimiter": "\n\n",
1198
+ "fewshot_config": {
1199
+ "sampler": "first_n"
1200
+ },
1201
+ "num_fewshot": 5,
1202
+ "metric_list": [
1203
+ {
1204
+ "metric": "acc",
1205
+ "aggregation": "mean",
1206
+ "higher_is_better": true
1207
+ }
1208
+ ],
1209
+ "output_type": "multiple_choice",
1210
+ "repeats": 1,
1211
+ "should_decontaminate": false,
1212
+ "metadata": {
1213
+ "version": 0.0
1214
+ }
1215
+ },
1216
+ "mmlu_high_school_government_and_politics": {
1217
+ "task": "mmlu_high_school_government_and_politics",
1218
+ "task_alias": "high_school_government_and_politics",
1219
+ "group": "mmlu_social_sciences",
1220
+ "group_alias": "social_sciences",
1221
+ "dataset_path": "hails/mmlu_no_train",
1222
+ "dataset_name": "high_school_government_and_politics",
1223
+ "test_split": "test",
1224
+ "fewshot_split": "dev",
1225
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1226
+ "doc_to_target": "answer",
1227
+ "doc_to_choice": [
1228
+ "A",
1229
+ "B",
1230
+ "C",
1231
+ "D"
1232
+ ],
1233
+ "description": "The following are multiple choice questions (with answers) about high school government and politics.\n\n",
1234
+ "target_delimiter": " ",
1235
+ "fewshot_delimiter": "\n\n",
1236
+ "fewshot_config": {
1237
+ "sampler": "first_n"
1238
+ },
1239
+ "num_fewshot": 5,
1240
+ "metric_list": [
1241
+ {
1242
+ "metric": "acc",
1243
+ "aggregation": "mean",
1244
+ "higher_is_better": true
1245
+ }
1246
+ ],
1247
+ "output_type": "multiple_choice",
1248
+ "repeats": 1,
1249
+ "should_decontaminate": false,
1250
+ "metadata": {
1251
+ "version": 0.0
1252
+ }
1253
+ },
1254
+ "mmlu_high_school_macroeconomics": {
1255
+ "task": "mmlu_high_school_macroeconomics",
1256
+ "task_alias": "high_school_macroeconomics",
1257
+ "group": "mmlu_social_sciences",
1258
+ "group_alias": "social_sciences",
1259
+ "dataset_path": "hails/mmlu_no_train",
1260
+ "dataset_name": "high_school_macroeconomics",
1261
+ "test_split": "test",
1262
+ "fewshot_split": "dev",
1263
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1264
+ "doc_to_target": "answer",
1265
+ "doc_to_choice": [
1266
+ "A",
1267
+ "B",
1268
+ "C",
1269
+ "D"
1270
+ ],
1271
+ "description": "The following are multiple choice questions (with answers) about high school macroeconomics.\n\n",
1272
+ "target_delimiter": " ",
1273
+ "fewshot_delimiter": "\n\n",
1274
+ "fewshot_config": {
1275
+ "sampler": "first_n"
1276
+ },
1277
+ "num_fewshot": 5,
1278
+ "metric_list": [
1279
+ {
1280
+ "metric": "acc",
1281
+ "aggregation": "mean",
1282
+ "higher_is_better": true
1283
+ }
1284
+ ],
1285
+ "output_type": "multiple_choice",
1286
+ "repeats": 1,
1287
+ "should_decontaminate": false,
1288
+ "metadata": {
1289
+ "version": 0.0
1290
+ }
1291
+ },
1292
+ "mmlu_high_school_mathematics": {
1293
+ "task": "mmlu_high_school_mathematics",
1294
+ "task_alias": "high_school_mathematics",
1295
+ "group": "mmlu_stem",
1296
+ "group_alias": "stem",
1297
+ "dataset_path": "hails/mmlu_no_train",
1298
+ "dataset_name": "high_school_mathematics",
1299
+ "test_split": "test",
1300
+ "fewshot_split": "dev",
1301
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1302
+ "doc_to_target": "answer",
1303
+ "doc_to_choice": [
1304
+ "A",
1305
+ "B",
1306
+ "C",
1307
+ "D"
1308
+ ],
1309
+ "description": "The following are multiple choice questions (with answers) about high school mathematics.\n\n",
1310
+ "target_delimiter": " ",
1311
+ "fewshot_delimiter": "\n\n",
1312
+ "fewshot_config": {
1313
+ "sampler": "first_n"
1314
+ },
1315
+ "num_fewshot": 5,
1316
+ "metric_list": [
1317
+ {
1318
+ "metric": "acc",
1319
+ "aggregation": "mean",
1320
+ "higher_is_better": true
1321
+ }
1322
+ ],
1323
+ "output_type": "multiple_choice",
1324
+ "repeats": 1,
1325
+ "should_decontaminate": false,
1326
+ "metadata": {
1327
+ "version": 0.0
1328
+ }
1329
+ },
1330
+ "mmlu_high_school_microeconomics": {
1331
+ "task": "mmlu_high_school_microeconomics",
1332
+ "task_alias": "high_school_microeconomics",
1333
+ "group": "mmlu_social_sciences",
1334
+ "group_alias": "social_sciences",
1335
+ "dataset_path": "hails/mmlu_no_train",
1336
+ "dataset_name": "high_school_microeconomics",
1337
+ "test_split": "test",
1338
+ "fewshot_split": "dev",
1339
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1340
+ "doc_to_target": "answer",
1341
+ "doc_to_choice": [
1342
+ "A",
1343
+ "B",
1344
+ "C",
1345
+ "D"
1346
+ ],
1347
+ "description": "The following are multiple choice questions (with answers) about high school microeconomics.\n\n",
1348
+ "target_delimiter": " ",
1349
+ "fewshot_delimiter": "\n\n",
1350
+ "fewshot_config": {
1351
+ "sampler": "first_n"
1352
+ },
1353
+ "num_fewshot": 5,
1354
+ "metric_list": [
1355
+ {
1356
+ "metric": "acc",
1357
+ "aggregation": "mean",
1358
+ "higher_is_better": true
1359
+ }
1360
+ ],
1361
+ "output_type": "multiple_choice",
1362
+ "repeats": 1,
1363
+ "should_decontaminate": false,
1364
+ "metadata": {
1365
+ "version": 0.0
1366
+ }
1367
+ },
1368
+ "mmlu_high_school_physics": {
1369
+ "task": "mmlu_high_school_physics",
1370
+ "task_alias": "high_school_physics",
1371
+ "group": "mmlu_stem",
1372
+ "group_alias": "stem",
1373
+ "dataset_path": "hails/mmlu_no_train",
1374
+ "dataset_name": "high_school_physics",
1375
+ "test_split": "test",
1376
+ "fewshot_split": "dev",
1377
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1378
+ "doc_to_target": "answer",
1379
+ "doc_to_choice": [
1380
+ "A",
1381
+ "B",
1382
+ "C",
1383
+ "D"
1384
+ ],
1385
+ "description": "The following are multiple choice questions (with answers) about high school physics.\n\n",
1386
+ "target_delimiter": " ",
1387
+ "fewshot_delimiter": "\n\n",
1388
+ "fewshot_config": {
1389
+ "sampler": "first_n"
1390
+ },
1391
+ "num_fewshot": 5,
1392
+ "metric_list": [
1393
+ {
1394
+ "metric": "acc",
1395
+ "aggregation": "mean",
1396
+ "higher_is_better": true
1397
+ }
1398
+ ],
1399
+ "output_type": "multiple_choice",
1400
+ "repeats": 1,
1401
+ "should_decontaminate": false,
1402
+ "metadata": {
1403
+ "version": 0.0
1404
+ }
1405
+ },
1406
+ "mmlu_high_school_psychology": {
1407
+ "task": "mmlu_high_school_psychology",
1408
+ "task_alias": "high_school_psychology",
1409
+ "group": "mmlu_social_sciences",
1410
+ "group_alias": "social_sciences",
1411
+ "dataset_path": "hails/mmlu_no_train",
1412
+ "dataset_name": "high_school_psychology",
1413
+ "test_split": "test",
1414
+ "fewshot_split": "dev",
1415
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1416
+ "doc_to_target": "answer",
1417
+ "doc_to_choice": [
1418
+ "A",
1419
+ "B",
1420
+ "C",
1421
+ "D"
1422
+ ],
1423
+ "description": "The following are multiple choice questions (with answers) about high school psychology.\n\n",
1424
+ "target_delimiter": " ",
1425
+ "fewshot_delimiter": "\n\n",
1426
+ "fewshot_config": {
1427
+ "sampler": "first_n"
1428
+ },
1429
+ "num_fewshot": 5,
1430
+ "metric_list": [
1431
+ {
1432
+ "metric": "acc",
1433
+ "aggregation": "mean",
1434
+ "higher_is_better": true
1435
+ }
1436
+ ],
1437
+ "output_type": "multiple_choice",
1438
+ "repeats": 1,
1439
+ "should_decontaminate": false,
1440
+ "metadata": {
1441
+ "version": 0.0
1442
+ }
1443
+ },
1444
+ "mmlu_high_school_statistics": {
1445
+ "task": "mmlu_high_school_statistics",
1446
+ "task_alias": "high_school_statistics",
1447
+ "group": "mmlu_stem",
1448
+ "group_alias": "stem",
1449
+ "dataset_path": "hails/mmlu_no_train",
1450
+ "dataset_name": "high_school_statistics",
1451
+ "test_split": "test",
1452
+ "fewshot_split": "dev",
1453
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1454
+ "doc_to_target": "answer",
1455
+ "doc_to_choice": [
1456
+ "A",
1457
+ "B",
1458
+ "C",
1459
+ "D"
1460
+ ],
1461
+ "description": "The following are multiple choice questions (with answers) about high school statistics.\n\n",
1462
+ "target_delimiter": " ",
1463
+ "fewshot_delimiter": "\n\n",
1464
+ "fewshot_config": {
1465
+ "sampler": "first_n"
1466
+ },
1467
+ "num_fewshot": 5,
1468
+ "metric_list": [
1469
+ {
1470
+ "metric": "acc",
1471
+ "aggregation": "mean",
1472
+ "higher_is_better": true
1473
+ }
1474
+ ],
1475
+ "output_type": "multiple_choice",
1476
+ "repeats": 1,
1477
+ "should_decontaminate": false,
1478
+ "metadata": {
1479
+ "version": 0.0
1480
+ }
1481
+ },
1482
+ "mmlu_high_school_us_history": {
1483
+ "task": "mmlu_high_school_us_history",
1484
+ "task_alias": "high_school_us_history",
1485
+ "group": "mmlu_humanities",
1486
+ "group_alias": "humanities",
1487
+ "dataset_path": "hails/mmlu_no_train",
1488
+ "dataset_name": "high_school_us_history",
1489
+ "test_split": "test",
1490
+ "fewshot_split": "dev",
1491
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1492
+ "doc_to_target": "answer",
1493
+ "doc_to_choice": [
1494
+ "A",
1495
+ "B",
1496
+ "C",
1497
+ "D"
1498
+ ],
1499
+ "description": "The following are multiple choice questions (with answers) about high school us history.\n\n",
1500
+ "target_delimiter": " ",
1501
+ "fewshot_delimiter": "\n\n",
1502
+ "fewshot_config": {
1503
+ "sampler": "first_n"
1504
+ },
1505
+ "num_fewshot": 5,
1506
+ "metric_list": [
1507
+ {
1508
+ "metric": "acc",
1509
+ "aggregation": "mean",
1510
+ "higher_is_better": true
1511
+ }
1512
+ ],
1513
+ "output_type": "multiple_choice",
1514
+ "repeats": 1,
1515
+ "should_decontaminate": false,
1516
+ "metadata": {
1517
+ "version": 0.0
1518
+ }
1519
+ },
1520
+ "mmlu_high_school_world_history": {
1521
+ "task": "mmlu_high_school_world_history",
1522
+ "task_alias": "high_school_world_history",
1523
+ "group": "mmlu_humanities",
1524
+ "group_alias": "humanities",
1525
+ "dataset_path": "hails/mmlu_no_train",
1526
+ "dataset_name": "high_school_world_history",
1527
+ "test_split": "test",
1528
+ "fewshot_split": "dev",
1529
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1530
+ "doc_to_target": "answer",
1531
+ "doc_to_choice": [
1532
+ "A",
1533
+ "B",
1534
+ "C",
1535
+ "D"
1536
+ ],
1537
+ "description": "The following are multiple choice questions (with answers) about high school world history.\n\n",
1538
+ "target_delimiter": " ",
1539
+ "fewshot_delimiter": "\n\n",
1540
+ "fewshot_config": {
1541
+ "sampler": "first_n"
1542
+ },
1543
+ "num_fewshot": 5,
1544
+ "metric_list": [
1545
+ {
1546
+ "metric": "acc",
1547
+ "aggregation": "mean",
1548
+ "higher_is_better": true
1549
+ }
1550
+ ],
1551
+ "output_type": "multiple_choice",
1552
+ "repeats": 1,
1553
+ "should_decontaminate": false,
1554
+ "metadata": {
1555
+ "version": 0.0
1556
+ }
1557
+ },
1558
+ "mmlu_human_aging": {
1559
+ "task": "mmlu_human_aging",
1560
+ "task_alias": "human_aging",
1561
+ "group": "mmlu_other",
1562
+ "group_alias": "other",
1563
+ "dataset_path": "hails/mmlu_no_train",
1564
+ "dataset_name": "human_aging",
1565
+ "test_split": "test",
1566
+ "fewshot_split": "dev",
1567
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1568
+ "doc_to_target": "answer",
1569
+ "doc_to_choice": [
1570
+ "A",
1571
+ "B",
1572
+ "C",
1573
+ "D"
1574
+ ],
1575
+ "description": "The following are multiple choice questions (with answers) about human aging.\n\n",
1576
+ "target_delimiter": " ",
1577
+ "fewshot_delimiter": "\n\n",
1578
+ "fewshot_config": {
1579
+ "sampler": "first_n"
1580
+ },
1581
+ "num_fewshot": 5,
1582
+ "metric_list": [
1583
+ {
1584
+ "metric": "acc",
1585
+ "aggregation": "mean",
1586
+ "higher_is_better": true
1587
+ }
1588
+ ],
1589
+ "output_type": "multiple_choice",
1590
+ "repeats": 1,
1591
+ "should_decontaminate": false,
1592
+ "metadata": {
1593
+ "version": 0.0
1594
+ }
1595
+ },
1596
+ "mmlu_human_sexuality": {
1597
+ "task": "mmlu_human_sexuality",
1598
+ "task_alias": "human_sexuality",
1599
+ "group": "mmlu_social_sciences",
1600
+ "group_alias": "social_sciences",
1601
+ "dataset_path": "hails/mmlu_no_train",
1602
+ "dataset_name": "human_sexuality",
1603
+ "test_split": "test",
1604
+ "fewshot_split": "dev",
1605
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1606
+ "doc_to_target": "answer",
1607
+ "doc_to_choice": [
1608
+ "A",
1609
+ "B",
1610
+ "C",
1611
+ "D"
1612
+ ],
1613
+ "description": "The following are multiple choice questions (with answers) about human sexuality.\n\n",
1614
+ "target_delimiter": " ",
1615
+ "fewshot_delimiter": "\n\n",
1616
+ "fewshot_config": {
1617
+ "sampler": "first_n"
1618
+ },
1619
+ "num_fewshot": 5,
1620
+ "metric_list": [
1621
+ {
1622
+ "metric": "acc",
1623
+ "aggregation": "mean",
1624
+ "higher_is_better": true
1625
+ }
1626
+ ],
1627
+ "output_type": "multiple_choice",
1628
+ "repeats": 1,
1629
+ "should_decontaminate": false,
1630
+ "metadata": {
1631
+ "version": 0.0
1632
+ }
1633
+ },
1634
+ "mmlu_international_law": {
1635
+ "task": "mmlu_international_law",
1636
+ "task_alias": "international_law",
1637
+ "group": "mmlu_humanities",
1638
+ "group_alias": "humanities",
1639
+ "dataset_path": "hails/mmlu_no_train",
1640
+ "dataset_name": "international_law",
1641
+ "test_split": "test",
1642
+ "fewshot_split": "dev",
1643
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1644
+ "doc_to_target": "answer",
1645
+ "doc_to_choice": [
1646
+ "A",
1647
+ "B",
1648
+ "C",
1649
+ "D"
1650
+ ],
1651
+ "description": "The following are multiple choice questions (with answers) about international law.\n\n",
1652
+ "target_delimiter": " ",
1653
+ "fewshot_delimiter": "\n\n",
1654
+ "fewshot_config": {
1655
+ "sampler": "first_n"
1656
+ },
1657
+ "num_fewshot": 5,
1658
+ "metric_list": [
1659
+ {
1660
+ "metric": "acc",
1661
+ "aggregation": "mean",
1662
+ "higher_is_better": true
1663
+ }
1664
+ ],
1665
+ "output_type": "multiple_choice",
1666
+ "repeats": 1,
1667
+ "should_decontaminate": false,
1668
+ "metadata": {
1669
+ "version": 0.0
1670
+ }
1671
+ },
1672
+ "mmlu_jurisprudence": {
1673
+ "task": "mmlu_jurisprudence",
1674
+ "task_alias": "jurisprudence",
1675
+ "group": "mmlu_humanities",
1676
+ "group_alias": "humanities",
1677
+ "dataset_path": "hails/mmlu_no_train",
1678
+ "dataset_name": "jurisprudence",
1679
+ "test_split": "test",
1680
+ "fewshot_split": "dev",
1681
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1682
+ "doc_to_target": "answer",
1683
+ "doc_to_choice": [
1684
+ "A",
1685
+ "B",
1686
+ "C",
1687
+ "D"
1688
+ ],
1689
+ "description": "The following are multiple choice questions (with answers) about jurisprudence.\n\n",
1690
+ "target_delimiter": " ",
1691
+ "fewshot_delimiter": "\n\n",
1692
+ "fewshot_config": {
1693
+ "sampler": "first_n"
1694
+ },
1695
+ "num_fewshot": 5,
1696
+ "metric_list": [
1697
+ {
1698
+ "metric": "acc",
1699
+ "aggregation": "mean",
1700
+ "higher_is_better": true
1701
+ }
1702
+ ],
1703
+ "output_type": "multiple_choice",
1704
+ "repeats": 1,
1705
+ "should_decontaminate": false,
1706
+ "metadata": {
1707
+ "version": 0.0
1708
+ }
1709
+ },
1710
+ "mmlu_logical_fallacies": {
1711
+ "task": "mmlu_logical_fallacies",
1712
+ "task_alias": "logical_fallacies",
1713
+ "group": "mmlu_humanities",
1714
+ "group_alias": "humanities",
1715
+ "dataset_path": "hails/mmlu_no_train",
1716
+ "dataset_name": "logical_fallacies",
1717
+ "test_split": "test",
1718
+ "fewshot_split": "dev",
1719
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1720
+ "doc_to_target": "answer",
1721
+ "doc_to_choice": [
1722
+ "A",
1723
+ "B",
1724
+ "C",
1725
+ "D"
1726
+ ],
1727
+ "description": "The following are multiple choice questions (with answers) about logical fallacies.\n\n",
1728
+ "target_delimiter": " ",
1729
+ "fewshot_delimiter": "\n\n",
1730
+ "fewshot_config": {
1731
+ "sampler": "first_n"
1732
+ },
1733
+ "num_fewshot": 5,
1734
+ "metric_list": [
1735
+ {
1736
+ "metric": "acc",
1737
+ "aggregation": "mean",
1738
+ "higher_is_better": true
1739
+ }
1740
+ ],
1741
+ "output_type": "multiple_choice",
1742
+ "repeats": 1,
1743
+ "should_decontaminate": false,
1744
+ "metadata": {
1745
+ "version": 0.0
1746
+ }
1747
+ },
1748
+ "mmlu_machine_learning": {
1749
+ "task": "mmlu_machine_learning",
1750
+ "task_alias": "machine_learning",
1751
+ "group": "mmlu_stem",
1752
+ "group_alias": "stem",
1753
+ "dataset_path": "hails/mmlu_no_train",
1754
+ "dataset_name": "machine_learning",
1755
+ "test_split": "test",
1756
+ "fewshot_split": "dev",
1757
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1758
+ "doc_to_target": "answer",
1759
+ "doc_to_choice": [
1760
+ "A",
1761
+ "B",
1762
+ "C",
1763
+ "D"
1764
+ ],
1765
+ "description": "The following are multiple choice questions (with answers) about machine learning.\n\n",
1766
+ "target_delimiter": " ",
1767
+ "fewshot_delimiter": "\n\n",
1768
+ "fewshot_config": {
1769
+ "sampler": "first_n"
1770
+ },
1771
+ "num_fewshot": 5,
1772
+ "metric_list": [
1773
+ {
1774
+ "metric": "acc",
1775
+ "aggregation": "mean",
1776
+ "higher_is_better": true
1777
+ }
1778
+ ],
1779
+ "output_type": "multiple_choice",
1780
+ "repeats": 1,
1781
+ "should_decontaminate": false,
1782
+ "metadata": {
1783
+ "version": 0.0
1784
+ }
1785
+ },
1786
+ "mmlu_management": {
1787
+ "task": "mmlu_management",
1788
+ "task_alias": "management",
1789
+ "group": "mmlu_other",
1790
+ "group_alias": "other",
1791
+ "dataset_path": "hails/mmlu_no_train",
1792
+ "dataset_name": "management",
1793
+ "test_split": "test",
1794
+ "fewshot_split": "dev",
1795
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1796
+ "doc_to_target": "answer",
1797
+ "doc_to_choice": [
1798
+ "A",
1799
+ "B",
1800
+ "C",
1801
+ "D"
1802
+ ],
1803
+ "description": "The following are multiple choice questions (with answers) about management.\n\n",
1804
+ "target_delimiter": " ",
1805
+ "fewshot_delimiter": "\n\n",
1806
+ "fewshot_config": {
1807
+ "sampler": "first_n"
1808
+ },
1809
+ "num_fewshot": 5,
1810
+ "metric_list": [
1811
+ {
1812
+ "metric": "acc",
1813
+ "aggregation": "mean",
1814
+ "higher_is_better": true
1815
+ }
1816
+ ],
1817
+ "output_type": "multiple_choice",
1818
+ "repeats": 1,
1819
+ "should_decontaminate": false,
1820
+ "metadata": {
1821
+ "version": 0.0
1822
+ }
1823
+ },
1824
+ "mmlu_marketing": {
1825
+ "task": "mmlu_marketing",
1826
+ "task_alias": "marketing",
1827
+ "group": "mmlu_other",
1828
+ "group_alias": "other",
1829
+ "dataset_path": "hails/mmlu_no_train",
1830
+ "dataset_name": "marketing",
1831
+ "test_split": "test",
1832
+ "fewshot_split": "dev",
1833
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1834
+ "doc_to_target": "answer",
1835
+ "doc_to_choice": [
1836
+ "A",
1837
+ "B",
1838
+ "C",
1839
+ "D"
1840
+ ],
1841
+ "description": "The following are multiple choice questions (with answers) about marketing.\n\n",
1842
+ "target_delimiter": " ",
1843
+ "fewshot_delimiter": "\n\n",
1844
+ "fewshot_config": {
1845
+ "sampler": "first_n"
1846
+ },
1847
+ "num_fewshot": 5,
1848
+ "metric_list": [
1849
+ {
1850
+ "metric": "acc",
1851
+ "aggregation": "mean",
1852
+ "higher_is_better": true
1853
+ }
1854
+ ],
1855
+ "output_type": "multiple_choice",
1856
+ "repeats": 1,
1857
+ "should_decontaminate": false,
1858
+ "metadata": {
1859
+ "version": 0.0
1860
+ }
1861
+ },
1862
+ "mmlu_medical_genetics": {
1863
+ "task": "mmlu_medical_genetics",
1864
+ "task_alias": "medical_genetics",
1865
+ "group": "mmlu_other",
1866
+ "group_alias": "other",
1867
+ "dataset_path": "hails/mmlu_no_train",
1868
+ "dataset_name": "medical_genetics",
1869
+ "test_split": "test",
1870
+ "fewshot_split": "dev",
1871
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1872
+ "doc_to_target": "answer",
1873
+ "doc_to_choice": [
1874
+ "A",
1875
+ "B",
1876
+ "C",
1877
+ "D"
1878
+ ],
1879
+ "description": "The following are multiple choice questions (with answers) about medical genetics.\n\n",
1880
+ "target_delimiter": " ",
1881
+ "fewshot_delimiter": "\n\n",
1882
+ "fewshot_config": {
1883
+ "sampler": "first_n"
1884
+ },
1885
+ "num_fewshot": 5,
1886
+ "metric_list": [
1887
+ {
1888
+ "metric": "acc",
1889
+ "aggregation": "mean",
1890
+ "higher_is_better": true
1891
+ }
1892
+ ],
1893
+ "output_type": "multiple_choice",
1894
+ "repeats": 1,
1895
+ "should_decontaminate": false,
1896
+ "metadata": {
1897
+ "version": 0.0
1898
+ }
1899
+ },
1900
+ "mmlu_miscellaneous": {
1901
+ "task": "mmlu_miscellaneous",
1902
+ "task_alias": "miscellaneous",
1903
+ "group": "mmlu_other",
1904
+ "group_alias": "other",
1905
+ "dataset_path": "hails/mmlu_no_train",
1906
+ "dataset_name": "miscellaneous",
1907
+ "test_split": "test",
1908
+ "fewshot_split": "dev",
1909
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1910
+ "doc_to_target": "answer",
1911
+ "doc_to_choice": [
1912
+ "A",
1913
+ "B",
1914
+ "C",
1915
+ "D"
1916
+ ],
1917
+ "description": "The following are multiple choice questions (with answers) about miscellaneous.\n\n",
1918
+ "target_delimiter": " ",
1919
+ "fewshot_delimiter": "\n\n",
1920
+ "fewshot_config": {
1921
+ "sampler": "first_n"
1922
+ },
1923
+ "num_fewshot": 5,
1924
+ "metric_list": [
1925
+ {
1926
+ "metric": "acc",
1927
+ "aggregation": "mean",
1928
+ "higher_is_better": true
1929
+ }
1930
+ ],
1931
+ "output_type": "multiple_choice",
1932
+ "repeats": 1,
1933
+ "should_decontaminate": false,
1934
+ "metadata": {
1935
+ "version": 0.0
1936
+ }
1937
+ },
1938
+ "mmlu_moral_disputes": {
1939
+ "task": "mmlu_moral_disputes",
1940
+ "task_alias": "moral_disputes",
1941
+ "group": "mmlu_humanities",
1942
+ "group_alias": "humanities",
1943
+ "dataset_path": "hails/mmlu_no_train",
1944
+ "dataset_name": "moral_disputes",
1945
+ "test_split": "test",
1946
+ "fewshot_split": "dev",
1947
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1948
+ "doc_to_target": "answer",
1949
+ "doc_to_choice": [
1950
+ "A",
1951
+ "B",
1952
+ "C",
1953
+ "D"
1954
+ ],
1955
+ "description": "The following are multiple choice questions (with answers) about moral disputes.\n\n",
1956
+ "target_delimiter": " ",
1957
+ "fewshot_delimiter": "\n\n",
1958
+ "fewshot_config": {
1959
+ "sampler": "first_n"
1960
+ },
1961
+ "num_fewshot": 5,
1962
+ "metric_list": [
1963
+ {
1964
+ "metric": "acc",
1965
+ "aggregation": "mean",
1966
+ "higher_is_better": true
1967
+ }
1968
+ ],
1969
+ "output_type": "multiple_choice",
1970
+ "repeats": 1,
1971
+ "should_decontaminate": false,
1972
+ "metadata": {
1973
+ "version": 0.0
1974
+ }
1975
+ },
1976
+ "mmlu_moral_scenarios": {
1977
+ "task": "mmlu_moral_scenarios",
1978
+ "task_alias": "moral_scenarios",
1979
+ "group": "mmlu_humanities",
1980
+ "group_alias": "humanities",
1981
+ "dataset_path": "hails/mmlu_no_train",
1982
+ "dataset_name": "moral_scenarios",
1983
+ "test_split": "test",
1984
+ "fewshot_split": "dev",
1985
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
1986
+ "doc_to_target": "answer",
1987
+ "doc_to_choice": [
1988
+ "A",
1989
+ "B",
1990
+ "C",
1991
+ "D"
1992
+ ],
1993
+ "description": "The following are multiple choice questions (with answers) about moral scenarios.\n\n",
1994
+ "target_delimiter": " ",
1995
+ "fewshot_delimiter": "\n\n",
1996
+ "fewshot_config": {
1997
+ "sampler": "first_n"
1998
+ },
1999
+ "num_fewshot": 5,
2000
+ "metric_list": [
2001
+ {
2002
+ "metric": "acc",
2003
+ "aggregation": "mean",
2004
+ "higher_is_better": true
2005
+ }
2006
+ ],
2007
+ "output_type": "multiple_choice",
2008
+ "repeats": 1,
2009
+ "should_decontaminate": false,
2010
+ "metadata": {
2011
+ "version": 0.0
2012
+ }
2013
+ },
2014
+ "mmlu_nutrition": {
2015
+ "task": "mmlu_nutrition",
2016
+ "task_alias": "nutrition",
2017
+ "group": "mmlu_other",
2018
+ "group_alias": "other",
2019
+ "dataset_path": "hails/mmlu_no_train",
2020
+ "dataset_name": "nutrition",
2021
+ "test_split": "test",
2022
+ "fewshot_split": "dev",
2023
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2024
+ "doc_to_target": "answer",
2025
+ "doc_to_choice": [
2026
+ "A",
2027
+ "B",
2028
+ "C",
2029
+ "D"
2030
+ ],
2031
+ "description": "The following are multiple choice questions (with answers) about nutrition.\n\n",
2032
+ "target_delimiter": " ",
2033
+ "fewshot_delimiter": "\n\n",
2034
+ "fewshot_config": {
2035
+ "sampler": "first_n"
2036
+ },
2037
+ "num_fewshot": 5,
2038
+ "metric_list": [
2039
+ {
2040
+ "metric": "acc",
2041
+ "aggregation": "mean",
2042
+ "higher_is_better": true
2043
+ }
2044
+ ],
2045
+ "output_type": "multiple_choice",
2046
+ "repeats": 1,
2047
+ "should_decontaminate": false,
2048
+ "metadata": {
2049
+ "version": 0.0
2050
+ }
2051
+ },
2052
+ "mmlu_philosophy": {
2053
+ "task": "mmlu_philosophy",
2054
+ "task_alias": "philosophy",
2055
+ "group": "mmlu_humanities",
2056
+ "group_alias": "humanities",
2057
+ "dataset_path": "hails/mmlu_no_train",
2058
+ "dataset_name": "philosophy",
2059
+ "test_split": "test",
2060
+ "fewshot_split": "dev",
2061
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2062
+ "doc_to_target": "answer",
2063
+ "doc_to_choice": [
2064
+ "A",
2065
+ "B",
2066
+ "C",
2067
+ "D"
2068
+ ],
2069
+ "description": "The following are multiple choice questions (with answers) about philosophy.\n\n",
2070
+ "target_delimiter": " ",
2071
+ "fewshot_delimiter": "\n\n",
2072
+ "fewshot_config": {
2073
+ "sampler": "first_n"
2074
+ },
2075
+ "num_fewshot": 5,
2076
+ "metric_list": [
2077
+ {
2078
+ "metric": "acc",
2079
+ "aggregation": "mean",
2080
+ "higher_is_better": true
2081
+ }
2082
+ ],
2083
+ "output_type": "multiple_choice",
2084
+ "repeats": 1,
2085
+ "should_decontaminate": false,
2086
+ "metadata": {
2087
+ "version": 0.0
2088
+ }
2089
+ },
2090
+ "mmlu_prehistory": {
2091
+ "task": "mmlu_prehistory",
2092
+ "task_alias": "prehistory",
2093
+ "group": "mmlu_humanities",
2094
+ "group_alias": "humanities",
2095
+ "dataset_path": "hails/mmlu_no_train",
2096
+ "dataset_name": "prehistory",
2097
+ "test_split": "test",
2098
+ "fewshot_split": "dev",
2099
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2100
+ "doc_to_target": "answer",
2101
+ "doc_to_choice": [
2102
+ "A",
2103
+ "B",
2104
+ "C",
2105
+ "D"
2106
+ ],
2107
+ "description": "The following are multiple choice questions (with answers) about prehistory.\n\n",
2108
+ "target_delimiter": " ",
2109
+ "fewshot_delimiter": "\n\n",
2110
+ "fewshot_config": {
2111
+ "sampler": "first_n"
2112
+ },
2113
+ "num_fewshot": 5,
2114
+ "metric_list": [
2115
+ {
2116
+ "metric": "acc",
2117
+ "aggregation": "mean",
2118
+ "higher_is_better": true
2119
+ }
2120
+ ],
2121
+ "output_type": "multiple_choice",
2122
+ "repeats": 1,
2123
+ "should_decontaminate": false,
2124
+ "metadata": {
2125
+ "version": 0.0
2126
+ }
2127
+ },
2128
+ "mmlu_professional_accounting": {
2129
+ "task": "mmlu_professional_accounting",
2130
+ "task_alias": "professional_accounting",
2131
+ "group": "mmlu_other",
2132
+ "group_alias": "other",
2133
+ "dataset_path": "hails/mmlu_no_train",
2134
+ "dataset_name": "professional_accounting",
2135
+ "test_split": "test",
2136
+ "fewshot_split": "dev",
2137
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2138
+ "doc_to_target": "answer",
2139
+ "doc_to_choice": [
2140
+ "A",
2141
+ "B",
2142
+ "C",
2143
+ "D"
2144
+ ],
2145
+ "description": "The following are multiple choice questions (with answers) about professional accounting.\n\n",
2146
+ "target_delimiter": " ",
2147
+ "fewshot_delimiter": "\n\n",
2148
+ "fewshot_config": {
2149
+ "sampler": "first_n"
2150
+ },
2151
+ "num_fewshot": 5,
2152
+ "metric_list": [
2153
+ {
2154
+ "metric": "acc",
2155
+ "aggregation": "mean",
2156
+ "higher_is_better": true
2157
+ }
2158
+ ],
2159
+ "output_type": "multiple_choice",
2160
+ "repeats": 1,
2161
+ "should_decontaminate": false,
2162
+ "metadata": {
2163
+ "version": 0.0
2164
+ }
2165
+ },
2166
+ "mmlu_professional_law": {
2167
+ "task": "mmlu_professional_law",
2168
+ "task_alias": "professional_law",
2169
+ "group": "mmlu_humanities",
2170
+ "group_alias": "humanities",
2171
+ "dataset_path": "hails/mmlu_no_train",
2172
+ "dataset_name": "professional_law",
2173
+ "test_split": "test",
2174
+ "fewshot_split": "dev",
2175
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2176
+ "doc_to_target": "answer",
2177
+ "doc_to_choice": [
2178
+ "A",
2179
+ "B",
2180
+ "C",
2181
+ "D"
2182
+ ],
2183
+ "description": "The following are multiple choice questions (with answers) about professional law.\n\n",
2184
+ "target_delimiter": " ",
2185
+ "fewshot_delimiter": "\n\n",
2186
+ "fewshot_config": {
2187
+ "sampler": "first_n"
2188
+ },
2189
+ "num_fewshot": 5,
2190
+ "metric_list": [
2191
+ {
2192
+ "metric": "acc",
2193
+ "aggregation": "mean",
2194
+ "higher_is_better": true
2195
+ }
2196
+ ],
2197
+ "output_type": "multiple_choice",
2198
+ "repeats": 1,
2199
+ "should_decontaminate": false,
2200
+ "metadata": {
2201
+ "version": 0.0
2202
+ }
2203
+ },
2204
+ "mmlu_professional_medicine": {
2205
+ "task": "mmlu_professional_medicine",
2206
+ "task_alias": "professional_medicine",
2207
+ "group": "mmlu_other",
2208
+ "group_alias": "other",
2209
+ "dataset_path": "hails/mmlu_no_train",
2210
+ "dataset_name": "professional_medicine",
2211
+ "test_split": "test",
2212
+ "fewshot_split": "dev",
2213
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2214
+ "doc_to_target": "answer",
2215
+ "doc_to_choice": [
2216
+ "A",
2217
+ "B",
2218
+ "C",
2219
+ "D"
2220
+ ],
2221
+ "description": "The following are multiple choice questions (with answers) about professional medicine.\n\n",
2222
+ "target_delimiter": " ",
2223
+ "fewshot_delimiter": "\n\n",
2224
+ "fewshot_config": {
2225
+ "sampler": "first_n"
2226
+ },
2227
+ "num_fewshot": 5,
2228
+ "metric_list": [
2229
+ {
2230
+ "metric": "acc",
2231
+ "aggregation": "mean",
2232
+ "higher_is_better": true
2233
+ }
2234
+ ],
2235
+ "output_type": "multiple_choice",
2236
+ "repeats": 1,
2237
+ "should_decontaminate": false,
2238
+ "metadata": {
2239
+ "version": 0.0
2240
+ }
2241
+ },
2242
+ "mmlu_professional_psychology": {
2243
+ "task": "mmlu_professional_psychology",
2244
+ "task_alias": "professional_psychology",
2245
+ "group": "mmlu_social_sciences",
2246
+ "group_alias": "social_sciences",
2247
+ "dataset_path": "hails/mmlu_no_train",
2248
+ "dataset_name": "professional_psychology",
2249
+ "test_split": "test",
2250
+ "fewshot_split": "dev",
2251
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2252
+ "doc_to_target": "answer",
2253
+ "doc_to_choice": [
2254
+ "A",
2255
+ "B",
2256
+ "C",
2257
+ "D"
2258
+ ],
2259
+ "description": "The following are multiple choice questions (with answers) about professional psychology.\n\n",
2260
+ "target_delimiter": " ",
2261
+ "fewshot_delimiter": "\n\n",
2262
+ "fewshot_config": {
2263
+ "sampler": "first_n"
2264
+ },
2265
+ "num_fewshot": 5,
2266
+ "metric_list": [
2267
+ {
2268
+ "metric": "acc",
2269
+ "aggregation": "mean",
2270
+ "higher_is_better": true
2271
+ }
2272
+ ],
2273
+ "output_type": "multiple_choice",
2274
+ "repeats": 1,
2275
+ "should_decontaminate": false,
2276
+ "metadata": {
2277
+ "version": 0.0
2278
+ }
2279
+ },
2280
+ "mmlu_public_relations": {
2281
+ "task": "mmlu_public_relations",
2282
+ "task_alias": "public_relations",
2283
+ "group": "mmlu_social_sciences",
2284
+ "group_alias": "social_sciences",
2285
+ "dataset_path": "hails/mmlu_no_train",
2286
+ "dataset_name": "public_relations",
2287
+ "test_split": "test",
2288
+ "fewshot_split": "dev",
2289
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2290
+ "doc_to_target": "answer",
2291
+ "doc_to_choice": [
2292
+ "A",
2293
+ "B",
2294
+ "C",
2295
+ "D"
2296
+ ],
2297
+ "description": "The following are multiple choice questions (with answers) about public relations.\n\n",
2298
+ "target_delimiter": " ",
2299
+ "fewshot_delimiter": "\n\n",
2300
+ "fewshot_config": {
2301
+ "sampler": "first_n"
2302
+ },
2303
+ "num_fewshot": 5,
2304
+ "metric_list": [
2305
+ {
2306
+ "metric": "acc",
2307
+ "aggregation": "mean",
2308
+ "higher_is_better": true
2309
+ }
2310
+ ],
2311
+ "output_type": "multiple_choice",
2312
+ "repeats": 1,
2313
+ "should_decontaminate": false,
2314
+ "metadata": {
2315
+ "version": 0.0
2316
+ }
2317
+ },
2318
+ "mmlu_security_studies": {
2319
+ "task": "mmlu_security_studies",
2320
+ "task_alias": "security_studies",
2321
+ "group": "mmlu_social_sciences",
2322
+ "group_alias": "social_sciences",
2323
+ "dataset_path": "hails/mmlu_no_train",
2324
+ "dataset_name": "security_studies",
2325
+ "test_split": "test",
2326
+ "fewshot_split": "dev",
2327
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2328
+ "doc_to_target": "answer",
2329
+ "doc_to_choice": [
2330
+ "A",
2331
+ "B",
2332
+ "C",
2333
+ "D"
2334
+ ],
2335
+ "description": "The following are multiple choice questions (with answers) about security studies.\n\n",
2336
+ "target_delimiter": " ",
2337
+ "fewshot_delimiter": "\n\n",
2338
+ "fewshot_config": {
2339
+ "sampler": "first_n"
2340
+ },
2341
+ "num_fewshot": 5,
2342
+ "metric_list": [
2343
+ {
2344
+ "metric": "acc",
2345
+ "aggregation": "mean",
2346
+ "higher_is_better": true
2347
+ }
2348
+ ],
2349
+ "output_type": "multiple_choice",
2350
+ "repeats": 1,
2351
+ "should_decontaminate": false,
2352
+ "metadata": {
2353
+ "version": 0.0
2354
+ }
2355
+ },
2356
+ "mmlu_sociology": {
2357
+ "task": "mmlu_sociology",
2358
+ "task_alias": "sociology",
2359
+ "group": "mmlu_social_sciences",
2360
+ "group_alias": "social_sciences",
2361
+ "dataset_path": "hails/mmlu_no_train",
2362
+ "dataset_name": "sociology",
2363
+ "test_split": "test",
2364
+ "fewshot_split": "dev",
2365
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2366
+ "doc_to_target": "answer",
2367
+ "doc_to_choice": [
2368
+ "A",
2369
+ "B",
2370
+ "C",
2371
+ "D"
2372
+ ],
2373
+ "description": "The following are multiple choice questions (with answers) about sociology.\n\n",
2374
+ "target_delimiter": " ",
2375
+ "fewshot_delimiter": "\n\n",
2376
+ "fewshot_config": {
2377
+ "sampler": "first_n"
2378
+ },
2379
+ "num_fewshot": 5,
2380
+ "metric_list": [
2381
+ {
2382
+ "metric": "acc",
2383
+ "aggregation": "mean",
2384
+ "higher_is_better": true
2385
+ }
2386
+ ],
2387
+ "output_type": "multiple_choice",
2388
+ "repeats": 1,
2389
+ "should_decontaminate": false,
2390
+ "metadata": {
2391
+ "version": 0.0
2392
+ }
2393
+ },
2394
+ "mmlu_us_foreign_policy": {
2395
+ "task": "mmlu_us_foreign_policy",
2396
+ "task_alias": "us_foreign_policy",
2397
+ "group": "mmlu_social_sciences",
2398
+ "group_alias": "social_sciences",
2399
+ "dataset_path": "hails/mmlu_no_train",
2400
+ "dataset_name": "us_foreign_policy",
2401
+ "test_split": "test",
2402
+ "fewshot_split": "dev",
2403
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2404
+ "doc_to_target": "answer",
2405
+ "doc_to_choice": [
2406
+ "A",
2407
+ "B",
2408
+ "C",
2409
+ "D"
2410
+ ],
2411
+ "description": "The following are multiple choice questions (with answers) about us foreign policy.\n\n",
2412
+ "target_delimiter": " ",
2413
+ "fewshot_delimiter": "\n\n",
2414
+ "fewshot_config": {
2415
+ "sampler": "first_n"
2416
+ },
2417
+ "num_fewshot": 5,
2418
+ "metric_list": [
2419
+ {
2420
+ "metric": "acc",
2421
+ "aggregation": "mean",
2422
+ "higher_is_better": true
2423
+ }
2424
+ ],
2425
+ "output_type": "multiple_choice",
2426
+ "repeats": 1,
2427
+ "should_decontaminate": false,
2428
+ "metadata": {
2429
+ "version": 0.0
2430
+ }
2431
+ },
2432
+ "mmlu_virology": {
2433
+ "task": "mmlu_virology",
2434
+ "task_alias": "virology",
2435
+ "group": "mmlu_other",
2436
+ "group_alias": "other",
2437
+ "dataset_path": "hails/mmlu_no_train",
2438
+ "dataset_name": "virology",
2439
+ "test_split": "test",
2440
+ "fewshot_split": "dev",
2441
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2442
+ "doc_to_target": "answer",
2443
+ "doc_to_choice": [
2444
+ "A",
2445
+ "B",
2446
+ "C",
2447
+ "D"
2448
+ ],
2449
+ "description": "The following are multiple choice questions (with answers) about virology.\n\n",
2450
+ "target_delimiter": " ",
2451
+ "fewshot_delimiter": "\n\n",
2452
+ "fewshot_config": {
2453
+ "sampler": "first_n"
2454
+ },
2455
+ "num_fewshot": 5,
2456
+ "metric_list": [
2457
+ {
2458
+ "metric": "acc",
2459
+ "aggregation": "mean",
2460
+ "higher_is_better": true
2461
+ }
2462
+ ],
2463
+ "output_type": "multiple_choice",
2464
+ "repeats": 1,
2465
+ "should_decontaminate": false,
2466
+ "metadata": {
2467
+ "version": 0.0
2468
+ }
2469
+ },
2470
+ "mmlu_world_religions": {
2471
+ "task": "mmlu_world_religions",
2472
+ "task_alias": "world_religions",
2473
+ "group": "mmlu_humanities",
2474
+ "group_alias": "humanities",
2475
+ "dataset_path": "hails/mmlu_no_train",
2476
+ "dataset_name": "world_religions",
2477
+ "test_split": "test",
2478
+ "fewshot_split": "dev",
2479
+ "doc_to_text": "{{question.strip()}}\nA. {{choices[0]}}\nB. {{choices[1]}}\nC. {{choices[2]}}\nD. {{choices[3]}}\nAnswer:",
2480
+ "doc_to_target": "answer",
2481
+ "doc_to_choice": [
2482
+ "A",
2483
+ "B",
2484
+ "C",
2485
+ "D"
2486
+ ],
2487
+ "description": "The following are multiple choice questions (with answers) about world religions.\n\n",
2488
+ "target_delimiter": " ",
2489
+ "fewshot_delimiter": "\n\n",
2490
+ "fewshot_config": {
2491
+ "sampler": "first_n"
2492
+ },
2493
+ "num_fewshot": 5,
2494
+ "metric_list": [
2495
+ {
2496
+ "metric": "acc",
2497
+ "aggregation": "mean",
2498
+ "higher_is_better": true
2499
+ }
2500
+ ],
2501
+ "output_type": "multiple_choice",
2502
+ "repeats": 1,
2503
+ "should_decontaminate": false,
2504
+ "metadata": {
2505
+ "version": 0.0
2506
+ }
2507
+ }
2508
+ },
2509
+ "versions": {
2510
+ "mmlu": "N/A",
2511
+ "mmlu_abstract_algebra": 0.0,
2512
+ "mmlu_anatomy": 0.0,
2513
+ "mmlu_astronomy": 0.0,
2514
+ "mmlu_business_ethics": 0.0,
2515
+ "mmlu_clinical_knowledge": 0.0,
2516
+ "mmlu_college_biology": 0.0,
2517
+ "mmlu_college_chemistry": 0.0,
2518
+ "mmlu_college_computer_science": 0.0,
2519
+ "mmlu_college_mathematics": 0.0,
2520
+ "mmlu_college_medicine": 0.0,
2521
+ "mmlu_college_physics": 0.0,
2522
+ "mmlu_computer_security": 0.0,
2523
+ "mmlu_conceptual_physics": 0.0,
2524
+ "mmlu_econometrics": 0.0,
2525
+ "mmlu_electrical_engineering": 0.0,
2526
+ "mmlu_elementary_mathematics": 0.0,
2527
+ "mmlu_formal_logic": 0.0,
2528
+ "mmlu_global_facts": 0.0,
2529
+ "mmlu_high_school_biology": 0.0,
2530
+ "mmlu_high_school_chemistry": 0.0,
2531
+ "mmlu_high_school_computer_science": 0.0,
2532
+ "mmlu_high_school_european_history": 0.0,
2533
+ "mmlu_high_school_geography": 0.0,
2534
+ "mmlu_high_school_government_and_politics": 0.0,
2535
+ "mmlu_high_school_macroeconomics": 0.0,
2536
+ "mmlu_high_school_mathematics": 0.0,
2537
+ "mmlu_high_school_microeconomics": 0.0,
2538
+ "mmlu_high_school_physics": 0.0,
2539
+ "mmlu_high_school_psychology": 0.0,
2540
+ "mmlu_high_school_statistics": 0.0,
2541
+ "mmlu_high_school_us_history": 0.0,
2542
+ "mmlu_high_school_world_history": 0.0,
2543
+ "mmlu_human_aging": 0.0,
2544
+ "mmlu_human_sexuality": 0.0,
2545
+ "mmlu_humanities": "N/A",
2546
+ "mmlu_international_law": 0.0,
2547
+ "mmlu_jurisprudence": 0.0,
2548
+ "mmlu_logical_fallacies": 0.0,
2549
+ "mmlu_machine_learning": 0.0,
2550
+ "mmlu_management": 0.0,
2551
+ "mmlu_marketing": 0.0,
2552
+ "mmlu_medical_genetics": 0.0,
2553
+ "mmlu_miscellaneous": 0.0,
2554
+ "mmlu_moral_disputes": 0.0,
2555
+ "mmlu_moral_scenarios": 0.0,
2556
+ "mmlu_nutrition": 0.0,
2557
+ "mmlu_other": "N/A",
2558
+ "mmlu_philosophy": 0.0,
2559
+ "mmlu_prehistory": 0.0,
2560
+ "mmlu_professional_accounting": 0.0,
2561
+ "mmlu_professional_law": 0.0,
2562
+ "mmlu_professional_medicine": 0.0,
2563
+ "mmlu_professional_psychology": 0.0,
2564
+ "mmlu_public_relations": 0.0,
2565
+ "mmlu_security_studies": 0.0,
2566
+ "mmlu_social_sciences": "N/A",
2567
+ "mmlu_sociology": 0.0,
2568
+ "mmlu_stem": "N/A",
2569
+ "mmlu_us_foreign_policy": 0.0,
2570
+ "mmlu_virology": 0.0,
2571
+ "mmlu_world_religions": 0.0
2572
+ },
2573
+ "n-shot": {
2574
+ "mmlu": 0,
2575
+ "mmlu_abstract_algebra": 5,
2576
+ "mmlu_anatomy": 5,
2577
+ "mmlu_astronomy": 5,
2578
+ "mmlu_business_ethics": 5,
2579
+ "mmlu_clinical_knowledge": 5,
2580
+ "mmlu_college_biology": 5,
2581
+ "mmlu_college_chemistry": 5,
2582
+ "mmlu_college_computer_science": 5,
2583
+ "mmlu_college_mathematics": 5,
2584
+ "mmlu_college_medicine": 5,
2585
+ "mmlu_college_physics": 5,
2586
+ "mmlu_computer_security": 5,
2587
+ "mmlu_conceptual_physics": 5,
2588
+ "mmlu_econometrics": 5,
2589
+ "mmlu_electrical_engineering": 5,
2590
+ "mmlu_elementary_mathematics": 5,
2591
+ "mmlu_formal_logic": 5,
2592
+ "mmlu_global_facts": 5,
2593
+ "mmlu_high_school_biology": 5,
2594
+ "mmlu_high_school_chemistry": 5,
2595
+ "mmlu_high_school_computer_science": 5,
2596
+ "mmlu_high_school_european_history": 5,
2597
+ "mmlu_high_school_geography": 5,
2598
+ "mmlu_high_school_government_and_politics": 5,
2599
+ "mmlu_high_school_macroeconomics": 5,
2600
+ "mmlu_high_school_mathematics": 5,
2601
+ "mmlu_high_school_microeconomics": 5,
2602
+ "mmlu_high_school_physics": 5,
2603
+ "mmlu_high_school_psychology": 5,
2604
+ "mmlu_high_school_statistics": 5,
2605
+ "mmlu_high_school_us_history": 5,
2606
+ "mmlu_high_school_world_history": 5,
2607
+ "mmlu_human_aging": 5,
2608
+ "mmlu_human_sexuality": 5,
2609
+ "mmlu_humanities": 5,
2610
+ "mmlu_international_law": 5,
2611
+ "mmlu_jurisprudence": 5,
2612
+ "mmlu_logical_fallacies": 5,
2613
+ "mmlu_machine_learning": 5,
2614
+ "mmlu_management": 5,
2615
+ "mmlu_marketing": 5,
2616
+ "mmlu_medical_genetics": 5,
2617
+ "mmlu_miscellaneous": 5,
2618
+ "mmlu_moral_disputes": 5,
2619
+ "mmlu_moral_scenarios": 5,
2620
+ "mmlu_nutrition": 5,
2621
+ "mmlu_other": 5,
2622
+ "mmlu_philosophy": 5,
2623
+ "mmlu_prehistory": 5,
2624
+ "mmlu_professional_accounting": 5,
2625
+ "mmlu_professional_law": 5,
2626
+ "mmlu_professional_medicine": 5,
2627
+ "mmlu_professional_psychology": 5,
2628
+ "mmlu_public_relations": 5,
2629
+ "mmlu_security_studies": 5,
2630
+ "mmlu_social_sciences": 5,
2631
+ "mmlu_sociology": 5,
2632
+ "mmlu_stem": 5,
2633
+ "mmlu_us_foreign_policy": 5,
2634
+ "mmlu_virology": 5,
2635
+ "mmlu_world_religions": 5
2636
+ },
2637
+ "config": {
2638
+ "model": "hf",
2639
+ "model_args": "pretrained=HuggingFaceH4/mistral-7b-ift,revision=v31.3,dtype=bfloat16",
2640
+ "batch_size": "auto",
2641
+ "batch_sizes": [
2642
+ 16
2643
+ ],
2644
+ "device": null,
2645
+ "use_cache": null,
2646
+ "limit": null,
2647
+ "bootstrap_iters": 100000,
2648
+ "gen_kwargs": null
2649
+ },
2650
+ "git_hash": "901c20b"
2651
+ }