Models | Data Source | Model Size (B) | Overall | Biology | Business | Chemistry | Computer Science | Economics | Engineering | Health | History | Law | Math | Philosophy | Physics | Psychology | Other |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
SmolLM-360M | TIGER-Lab | 0.36 | 0.1095 | 0.1116 | 0.0989 | 0.1007 | 0.1098 | 0.1209 | 0.0877 | 0.1259 | 0.1129 | 0.1117 | 0.1044 | 0.1303 | 0.1001 | 0.1266 | 0.1169 |
Llama-3.2-1B | TIGER-Lab | 1 | 0.1195 | 0.1325 | 0.1077 | 0.1095 | 0.1122 | 0.1256 | 0.1342 | 0.1247 | 0.0892 | 0.1317 | 0.1021 | 0.1323 | 0.1147 | 0.1253 | 0.1277 |
Llama-3.2-3B | TIGER-Lab | 3 | 0.2217 | 0.3919 | 0.2041 | 0.1458 | 0.2049 | 0.3258 | 0.1538 | 0.2518 | 0.2782 | 0.1626 | 0.2021 | 0.2365 | 0.1493 | 0.3033 | 0.2543 |
SmolLM-1.7B | TIGER-Lab | 1.7 | 0.1193 | 0.1046 | 0.1052 | 0.1069 | 0.1268 | 0.1197 | 0.1259 | 0.1467 | 0.1181 | 0.1371 | 0.1110 | 0.1423 | 0.1132 | 0.1115 | 0.1169 |
Granite-3.0-2B-Base | TIGER-Lab | 2 | 0.2172 | 0.3445 | 0.1977 | 0.1564 | 0.2659 | 0.3033 | 0.1465 | 0.2298 | 0.1837 | 0.1653 | 0.2058 | 0.2385 | 0.1640 | 0.3271 | 0.2327 |
Granite-3.0-8B-Base | TIGER-Lab | 8 | 0.3103 | 0.4728 | 0.3080 | 0.2217 | 0.3268 | 0.4135 | 0.2136 | 0.3863 | 0.3307 | 0.2316 | 0.2805 | 0.3467 | 0.2494 | 0.4336 | 0.3149 |
SmolLM2-1.7B | TIGER-Lab | 1.7 | 0.1831 | 0.1925 | 0.1762 | 0.1422 | 0.1634 | 0.2536 | 0.1373 | 0.2054 | 0.1575 | 0.1444 | 0.1673 | 0.2144 | 0.1655 | 0.2845 | 0.2045 |
SmolLM2-360M | TIGER-Lab | 0.36 | 0.1138 | 0.1269 | 0.1242 | 0.1131 | 0.1024 | 0.1209 | 0.1135 | 0.1112 | 0.1155 | 0.1299 | 0.1088 | 0.1202 | 0.0993 | 0.1015 | 0.1115 |
SmolLM2-135M | TIGER-Lab | 0.135 | 0.1085 | 0.1185 | 0.1179 | 0.0945 | 0.0927 | 0.1209 | 0.0815 | 0.1186 | 0.1024 | 0.1199 | 0.1192 | 0.1182 | 0.1032 | 0.1140 | 0.0952 |
Aya-Expanse-8B | TIGER-Lab | 8 | 0.3374 | 0.5579 | 0.3739 | 0.2217 | 0.3585 | 0.4336 | 0.2497 | 0.3741 | 0.3255 | 0.2307 | 0.3605 | 0.2966 | 0.2671 | 0.4599 | 0.3517 |
Aya-Expanse-32B | TIGER-Lab | 32 | 0.4541 | 0.6165 | 0.4804 | 0.3481 | 0.5122 | 0.5829 | 0.2838 | 0.5428 | 0.4961 | 0.3415 | 0.4152 | 0.4609 | 0.3949 | 0.6103 | 0.5108 |
Athene-V2-Chat | TIGER-Lab | 72 | 0.7021 | 0.8243 | 0.7351 | 0.7208 | 0.7341 | 0.8045 | 0.5470 | 0.6834 | 0.6273 | 0.4832 | 0.8061 | 0.6273 | 0.7336 | 0.7694 | 0.7056 |
Claude-3-5-Haiku-20241022 | TIGER-Lab | unk | 0.6212 | 0.8075 | 0.6857 | 0.5936 | 0.6463 | 0.7287 | 0.4231 | 0.6614 | 0.6220 | 0.4668 | 0.6173 | 0.5852 | 0.5835 | 0.7544 | 0.6645 |
Gemma-2-27B-it | TIGER-Lab | 27 | 0.5654 | 0.7796 | 0.6008 | 0.5371 | 0.5683 | 0.6979 | 0.3488 | 0.6186 | 0.5722 | 0.3951 | 0.5611 | 0.5230 | 0.5296 | 0.7155 | 0.6115 |
Mistral-Large-Instruct-2407 | TIGER-Lab | 123 | 0.6591 | 0.8271 | 0.6198 | 0.6307 | 0.7024 | 0.7571 | 0.4396 | 0.7152 | 0.6299 | 0.4823 | 0.7128 | 0.6453 | 0.6913 | 0.7707 | 0.6786 |
Mixtral-8x22B-Instruct-v0.1 | TIGER-Lab | 176 | 0.5633 | 0.7517 | 0.6033 | 0.4885 | 0.6415 | 0.6742 | 0.4324 | 0.6149 | 0.5643 | 0.3878 | 0.5655 | 0.5491 | 0.5089 | 0.7030 | 0.5996 |
Claude-3-Haiku-20240307 | TIGER-Lab | unk | 0.4229 | 0.7099 | 0.3663 | 0.2739 | 0.4049 | 0.5723 | 0.3209 | 0.5306 | 0.4357 | 0.3397 | 0.3397 | 0.4629 | 0.3226 | 0.6466 | 0.4556 |
Athene-V2-Chat (0-shot) | TIGER-Lab | 72 | 0.7311 | 0.8689 | 0.7921 | 0.7473 | 0.7463 | 0.8081 | 0.5996 | 0.7164 | 0.6404 | 0.5095 | 0.8364 | 0.6293 | 0.7737 | 0.7857 | 0.7219 |
EXAONE-3.5-2.4B-Instruct | TIGER-Lab | 2.4 | 0.3910 | 0.6541 | 0.3942 | 0.3171 | 0.4415 | 0.5261 | 0.3168 | 0.3851 | 0.3727 | 0.2498 | 0.4323 | 0.3026 | 0.3472 | 0.5038 | 0.3387 |
EXAONE-3.5-7.8B-Instruct | TIGER-Lab | 7.8 | 0.4624 | 0.7308 | 0.4740 | 0.3719 | 0.5537 | 0.6066 | 0.3767 | 0.4707 | 0.4514 | 0.2870 | 0.4996 | 0.4248 | 0.3818 | 0.5965 | 0.4426 |
EXAONE-3.5-32B-Instruct | TIGER-Lab | 32 | 0.5891 | 0.7573 | 0.6502 | 0.5830 | 0.6634 | 0.6908 | 0.4654 | 0.6002 | 0.5328 | 0.4005 | 0.6691 | 0.5371 | 0.5350 | 0.6830 | 0.5617 |
QwQ-32B | TIGER-Lab | 32 | 0.6907 | 0.8410 | 0.7668 | 0.7041 | 0.7366 | 0.7832 | 0.4871 | 0.7066 | 0.6325 | 0.4777 | 0.7876 | 0.6713 | 0.7028 | 0.7632 | 0.6537 |
Llama-3.3-70B-Instruct | TIGER-Lab | 70 | 0.6592 | 0.8187 | 0.6857 | 0.6246 | 0.7073 | 0.7784 | 0.4665 | 0.7115 | 0.6614 | 0.4796 | 0.6891 | 0.6353 | 0.6428 | 0.7794 | 0.6818 |
Mistral-Large-Instruct-2411 | TIGER-Lab | 123 | 0.6794 | 0.8368 | 0.7186 | 0.6290 | 0.7683 | 0.7642 | 0.5119 | 0.7457 | 0.6640 | 0.5041 | 0.7091 | 0.6533 | 0.6605 | 0.7845 | 0.7067 |
Gemini-2.0-Flash-exp | TIGER-Lab | unk | 0.7624 | 0.8836 | 0.7985 | 0.8004 | 0.7990 | 0.8169 | 0.6155 | 0.7442 | 0.7008 | 0.5647 | 0.8638 | 0.6994 | 0.8127 | 0.7905 | 0.7476 |
Granite-3.1-1B-A400M-Base | TIGER-Lab | 1 | 0.1234 | 0.1353 | 0.1153 | 0.1246 | 0.1415 | 0.1422 | 0.0980 | 0.1308 | 0.1234 | 0.1126 | 0.1310 | 0.1683 | 0.1124 | 0.1078 | 0.1212 |
Granite-3.1-1B-A400M-Instruct | TIGER-Lab | 1 | 0.1327 | 0.1437 | 0.1267 | 0.1148 | 0.1610 | 0.1576 | 0.1125 | 0.1638 | 0.1129 | 0.1253 | 0.1303 | 0.1202 | 0.1209 | 0.1617 | 0.1288 |
Granite-3.1-2B-Base | TIGER-Lab | 2 | 0.2389 | 0.3752 | 0.2560 | 0.1696 | 0.2439 | 0.3092 | 0.1930 | 0.2604 | 0.2178 | 0.1253 | 0.2487 | 0.2525 | 0.1986 | 0.3421 | 0.2565 |
Granite-3.1-2B-Instruct | TIGER-Lab | 2 | 0.3197 | 0.5007 | 0.3308 | 0.2412 | 0.3707 | 0.4111 | 0.2528 | 0.3056 | 0.3045 | 0.2180 | 0.3442 | 0.2846 | 0.2648 | 0.4411 | 0.3258 |
Granite-3.1-3B-A800M-Base | TIGER-Lab | 3 | 0.2039 | 0.2957 | 0.1762 | 0.1405 | 0.2268 | 0.2737 | 0.1527 | 0.2286 | 0.1995 | 0.1444 | 0.2198 | 0.2305 | 0.1640 | 0.3083 | 0.1926 |
Granite-3.1-3B-A800M-Instruct | TIGER-Lab | 3 | 0.2542 | 0.3431 | 0.2725 | 0.1608 | 0.2829 | 0.3626 | 0.1744 | 0.2641 | 0.2415 | 0.1708 | 0.2754 | 0.2806 | 0.2017 | 0.3810 | 0.2706 |
Granite-3.1-8B-Base | TIGER-Lab | 8 | 0.3308 | 0.4979 | 0.3181 | 0.2403 | 0.3390 | 0.4372 | 0.2425 | 0.3716 | 0.3412 | 0.2044 | 0.3249 | 0.3567 | 0.2748 | 0.4862 | 0.3636 |
Granite-3.1-8B-Instruct | TIGER-Lab | 8 | 0.4103 | 0.5746 | 0.4563 | 0.3145 | 0.4244 | 0.5047 | 0.2910 | 0.4707 | 0.4121 | 0.2607 | 0.4189 | 0.4329 | 0.3472 | 0.5739 | 0.4405 |
Deepseek-V3 | Self-Reported | 671 | 0.7587 | 0.8689 | 0.8099 | 0.7951 | 0.7951 | 0.8175 | 0.6223 | 0.7372 | 0.6824 | 0.5477 | 0.8616 | 0.7154 | 0.7898 | 0.7882 | 0.7641 |
QwQ-32B-Preview | Self-Reported | 32 | 0.7097 | 0.8490 | 0.7506 | 0.7429 | 0.4571 | 0.5469 | 0.6569 | 0.7784 | 0.7506 | 0.7605 | 0.6916 | 0.8494 | 0.6573 | 0.7732 | - |
SkyThought-T1 | Self-Reported | 32 | 0.692 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
MiniMax-Text-01 | Self-Reported | 456 | 0.757 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Llama-3.1-405B-Instruct | Self-Reported | 405 | 0.733 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Llama-3.1-405B | Self-Reported | 405 | 0.616 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Internlm3-8B-Instruct | Self-Reported | 8 | 0.576 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Hunyuan-Large | Self-Reported | 389 | 0.602 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Phi-4 | Self-Reported | 14 | 0.704 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
GPT-o1-mini | Self-Reported | unk | 0.803 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
GPT-4o (2024-11-20) | Self-Reported | unk | 0.779 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Claude-3.5-Sonnet (2024-10-22) | Self-Reported | unk | 0.78 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
DeepSeek-R1 | Self-Reported | 671 | 0.84 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Doubao-1.5-Pro | Self-Reported | unk | 0.801 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Qwen2.5-Max | Self-Reported | unk | 0.761 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Mistral-Small-base | Self-Reported | 24 | 0.544 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Mistral-Small-instruct | Self-Reported | 24 | 0.663 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Gemini-2.0-Flash-Lite | Self-Reported | unk | 0.716 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Gemini-2.0-Flash | Self-Reported | unk | 0.776 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
Gemini-2.0-Pro | Self-Reported | unk | 0.791 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
azerogpt | Self-Reported | unk | 0.6307 | 0.8215 | 0.6667 | 0.5080 | 0.6683 | 0.7393 | 0.4828 | 0.6650 | 0.6693 | 0.4587 | 0.6329 | 0.6212 | 0.5751 | 0.7506 | 0.6591 |
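
If the table above is exported to CSV, it can be queried programmatically. Below is a minimal sketch using pandas; the filename `leaderboard.csv` is an assumption, the column names follow the table header, and `-`, `unk`, and `null` cells are treated as missing values:

```python
import pandas as pd

# Assumed export of the leaderboard above: one CSV column per header field,
# with "-", "unk", or "null" marking unreported values.
df = pd.read_csv("leaderboard.csv", na_values=["-", "unk", "null"])

# The 14 per-subject score columns from the header.
subject_cols = [
    "Biology", "Business", "Chemistry", "Computer Science", "Economics",
    "Engineering", "Health", "History", "Law", "Math", "Philosophy",
    "Physics", "Psychology", "Other",
]

# Top 5 models by Overall score.
top5 = df.nlargest(5, "Overall")[["Models", "Data Source", "Overall"]]
print(top5.to_string(index=False))

# Mean per-subject score across models that report a full breakdown
# (rows with only an Overall value are dropped by dropna).
full = df.dropna(subset=subject_cols)
print(full[subject_cols].mean().sort_values(ascending=False))
```

Note that many Self-Reported rows publish only an Overall score, so per-subject aggregates should be computed over the subset of models with a complete breakdown, as in the sketch.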