sha (stringlengths 40-40) | text (stringlengths 0-13.4M) | id (stringlengths 2-117) | tags (sequence) | created_at (stringlengths 25-25) | metadata (stringlengths 2-31.7M) | last_modified (stringlengths 25-25) |
---|---|---|---|---|---|---|
e5820ff05e32f1533bba65ea2ca891ecb478e1f3 | ViktorKartierg/DanielPenis | [
"region:us"
] | 2024-01-13T00:30:35+00:00 | {} | 2024-01-13T00:31:31+00:00 |
|
65429422dff91f655aaa434a0dc1495125503b53 | SuperPupperDoggo/Star-Trek-Starships | [
"region:us"
] | 2024-01-13T00:35:05+00:00 | {} | 2024-01-13T01:51:05+00:00 |
|
fa9c1ef613fe8a7014d1fe6c731b3e222ec56c60 | Xenova/siglip-semantic-image-search-assets | [
"region:us"
] | 2024-01-13T00:55:51+00:00 | {} | 2024-01-13T01:34:40+00:00 |
|
12a911947af92d4e8f4fcbb8805dcc25d841a55a | GustavoBr497/b1zy | [
"region:us"
] | 2024-01-13T01:15:04+00:00 | {} | 2024-01-13T01:57:41+00:00 |
|
9fb2910550f4e00e706289360aa094faaaa58124 | Wanfq/fusechat_v1 | [
"region:us"
] | 2024-01-13T01:39:20+00:00 | {} | 2024-01-13T01:40:13+00:00 |
|
db94126f3b648e7f7d7d91cc58775a9f335f3467 | badokorach/Dataset130124 | [
"region:us"
] | 2024-01-13T01:43:41+00:00 | {} | 2024-01-13T01:44:12+00:00 |
|
98f77c38f0a5655dc28b2f31614a2a7ef8d8328f | zhaospei/refine_scg | [
"region:us"
] | 2024-01-13T01:47:29+00:00 | {} | 2024-01-13T01:59:55+00:00 |
|
accd40fa648d9c4c263ee7ddb42fcd2ca5a2a6de | yiyic/MTG_QG_DE | [
"region:us"
] | 2024-01-13T01:49:53+00:00 | {} | 2024-01-13T01:51:28+00:00 |
|
1e9713f36d796dbffefe7b4a92199c6009b219c3 | bhargavi909/Cancer_research | [
"region:us"
] | 2024-01-13T01:57:44+00:00 | {} | 2024-01-13T01:57:44+00:00 |
|
2554f1785dc8c2ee2e5aab73434b152c8953e894 | erlinersi/lzscpvz | [
"region:us"
] | 2024-01-13T02:12:03+00:00 | {} | 2024-01-13T07:41:42+00:00 |
|
52d82165949ded43a48f47fb00ba34d398581364 | zhaospei/sample_30k | [
"region:us"
] | 2024-01-13T02:24:41+00:00 | {} | 2024-01-13T02:25:18+00:00 |
|
70a9c013bb9ea61a605b17c5c7f083ad52bc71d4 | badokorach/Dataset1301241 | [
"region:us"
] | 2024-01-13T02:37:57+00:00 | {} | 2024-01-14T14:23:22+00:00 |
|
f698f128c9bb9a15f0ccc5c771223f92481b9c47 | oivlisnet/c4-en | [
"region:us"
] | 2024-01-13T02:45:21+00:00 | {} | 2024-01-13T03:00:17+00:00 |
|
43b5ae2b7a29a6247117e67df2586387b27c9480 | AhilanPonnusamy/InvestmentsDataset | [
"region:us"
] | 2024-01-13T03:38:49+00:00 | {} | 2024-01-15T01:50:03+00:00 |
|
c6da517719b08a8c9d90af1ee434369f766e99bb | s1ghhh/tmp0113 | [
"region:us"
] | 2024-01-13T03:47:24+00:00 | {} | 2024-01-15T12:38:38+00:00 |
|
264fdd7bb190a076d621153221da51cb8451153f | eduagarcia-temp/oab_exams_no_train | [
"region:us"
] | 2024-01-13T04:11:31+00:00 | {} | 2024-01-13T18:48:07+00:00 |
|
3dd8f4ba585893014b21bf5a3aa5f233430461c8 | joaojoao/alissaov2 | [
"region:us"
] | 2024-01-13T04:12:45+00:00 | {} | 2024-01-13T04:12:56+00:00 |
|
fe24cf6fa2b76ae8bbf44aff2168be89ee33f68d | thuxtk/gary-jule | [
"region:us"
] | 2024-01-13T04:38:15+00:00 | {} | 2024-01-13T04:47:00+00:00 |
|
b72c37a6f52f93e76ef71efab1678ce81b2cc9b3 | ShreeyaVenneti/TRAIN_SET_CSR_AS_REFERENCE_2COLUMNS_50entries | [
"region:us"
] | 2024-01-13T05:38:17+00:00 | {} | 2024-01-13T05:38:43+00:00 |
|
e855b3c713f3cd530f0b89d8b3b5988e24917ec0 | AjaxXML/alissao | [
"region:us"
] | 2024-01-13T07:07:51+00:00 | {} | 2024-01-13T07:09:29+00:00 |
|
4bf3d1587f294a93cb1f4d943235d6aaaa7305f0 | Chinchis/Herramientas_reescalado | [
"region:us"
] | 2024-01-13T07:46:14+00:00 | {} | 2024-01-13T07:47:18+00:00 |
|
845f6f604ac832e18d4f4582372d315b39e51a63 | radestijn/AutoOrca | [
"region:us"
] | 2024-01-13T08:27:32+00:00 | {} | 2024-01-13T08:27:32+00:00 |
|
a37ab5782f48d165a293f8b3463ed85d2e585fe6 | jiiishnu-17/url.dataset | [
"region:us"
] | 2024-01-13T08:30:48+00:00 | {} | 2024-01-13T08:30:48+00:00 |
|
2a9950740d1f5c0efa66c4249444e3f663a85903 | AlejandroLanaspa/Jewelry | [
"region:us"
] | 2024-01-13T08:36:18+00:00 | {} | 2024-01-13T10:34:26+00:00 |
|
9743e663b4e2148439086ae99b7e5755ca2b04a9 | xiaoguang123/Pictures | [
"region:us"
] | 2024-01-13T08:45:11+00:00 | {} | 2024-01-13T08:45:11+00:00 |
|
20d9d9de54faae2bb3c4a5e1c1b037009007c35b | chriscelaya/mini-platypus | [
"region:us"
] | 2024-01-13T09:14:18+00:00 | {} | 2024-01-13T09:14:18+00:00 |
|
7c8d592436914b0606c49227a9f11d1be39d6fc7 | MattyMroz/SoloLeveling | [
"region:us"
] | 2024-01-13T11:14:35+00:00 | {} | 2024-01-13T12:55:52+00:00 |
|
3bda59e21da3b16f19632de5214a7e03a2ee367f | cutterd/gelgen_tar_0 | [
"region:us"
] | 2024-01-13T11:22:43+00:00 | {} | 2024-01-13T12:39:12+00:00 |
|
507c57a86458f80cc4e0823a83905569a2c28e20 | cutterd/gelgen_tar_2 | [
"region:us"
] | 2024-01-13T11:22:43+00:00 | {} | 2024-01-13T12:41:37+00:00 |
|
c60f6018032b5111e899655381d4d05743748e8b | cutterd/gelgen_tar_1 | [
"region:us"
] | 2024-01-13T11:22:43+00:00 | {} | 2024-01-13T12:40:56+00:00 |
|
83a2cdf17c12369cff402703aa4bc089bc6f7b5f | cutterd/gelgen_tar_3 | [
"region:us"
] | 2024-01-13T11:22:43+00:00 | {} | 2024-01-13T12:39:16+00:00 |
|
3500fca51def95f5bf4e785642405fb621f68268 | cutterd/gelgen_tar_35 | [
"region:us"
] | 2024-01-13T11:22:55+00:00 | {} | 2024-01-13T12:16:10+00:00 |
|
8a8e22fd3762b82d0591964002971cae88566d44 | cutterd/gelgen_tar_34 | [
"region:us"
] | 2024-01-13T11:22:55+00:00 | {} | 2024-01-13T12:15:33+00:00 |
|
e49769cc4784913ef1820d0fb4efde61c6a98b9e | cutterd/gelgen_tar_33 | [
"region:us"
] | 2024-01-13T11:22:55+00:00 | {} | 2024-01-13T12:12:17+00:00 |
|
73cb70b339c7a322befa2aec36e957292f9da481 | cutterd/gelgen_tar_32 | [
"region:us"
] | 2024-01-13T11:22:55+00:00 | {} | 2024-01-13T12:15:11+00:00 |
|
5e34965dd20f876efe266283e4b2e9d53db157fc | cutterd/gelgen_tar_26 | [
"region:us"
] | 2024-01-13T11:23:07+00:00 | {} | 2024-01-13T12:40:57+00:00 |
|
ac0eef0e54c22fe920c12a61c3c7c3cb438c2f12 | cutterd/gelgen_tar_27 | [
"region:us"
] | 2024-01-13T11:23:07+00:00 | {} | 2024-01-13T12:39:05+00:00 |
|
deac3a195af2e41879f8a30a5e719c354acbacd0 | cutterd/gelgen_tar_24 | [
"region:us"
] | 2024-01-13T11:23:07+00:00 | {} | 2024-01-13T12:40:21+00:00 |
|
76ba093dc3f8b390cafb4956e50a0d95eb8b9982 | cutterd/gelgen_tar_25 | [
"region:us"
] | 2024-01-13T11:23:07+00:00 | {} | 2024-01-13T12:38:09+00:00 |
|
9721bc135dc06adba40095e958555b6fda02f74f | cutterd/gelgen_tar_28 | [
"region:us"
] | 2024-01-13T11:23:20+00:00 | {} | 2024-01-13T12:33:41+00:00 |
|
527ae2b57cac30f7962df9bf497b99053c1a6f02 | cutterd/gelgen_tar_29 | [
"region:us"
] | 2024-01-13T11:23:20+00:00 | {} | 2024-01-13T12:38:36+00:00 |
|
d6640e9f44b65419decebc1a7e5afbbc9287eee2 | cutterd/gelgen_tar_30 | [
"region:us"
] | 2024-01-13T11:23:20+00:00 | {} | 2024-01-13T12:37:33+00:00 |
|
0ae45e3d573ec812e9484709dca40347c3ed8915 | cutterd/gelgen_tar_31 | [
"region:us"
] | 2024-01-13T11:23:20+00:00 | {} | 2024-01-13T12:39:11+00:00 |
|
e718ac01407bed4892ce12c674df63b9afa76707 | cutterd/gelgen_tar_38 | [
"region:us"
] | 2024-01-13T11:23:21+00:00 | {} | 2024-01-13T12:42:23+00:00 |
|
d5391901a138349d59cc949acd4dccd8edd490cd | cutterd/gelgen_tar_37 | [
"region:us"
] | 2024-01-13T11:23:21+00:00 | {} | 2024-01-13T12:42:50+00:00 |
|
8c40f8fcde02df962bdc894b90ec46d7bff6703d | cutterd/gelgen_tar_39 | [
"region:us"
] | 2024-01-13T11:23:21+00:00 | {} | 2024-01-13T12:43:06+00:00 |
|
c4fa37f0ff36d16fd8dcb96a5df2b69e78726a3d | cutterd/gelgen_tar_36 | [
"region:us"
] | 2024-01-13T11:23:22+00:00 | {} | 2024-01-13T12:41:24+00:00 |
|
ce575a37cd89ec1a049eb51f6d4027fa93ff861c | OthnnyEL/fgan-annotate-dataset | [
"region:us"
] | 2024-01-13T12:12:14+00:00 | {} | 2024-01-13T12:12:14+00:00 |
|
68481d611e17759532f811a7a899ef111789fcb8 | montebello-642/civillian-complaints | [
"region:us"
] | 2024-01-13T12:48:02+00:00 | {} | 2024-01-13T12:49:37+00:00 |
|
3b6227fe3c7ac7fdb008b9126700d612446367f5 | cutterd/gelgen_tar_45 | [
"region:us"
] | 2024-01-13T13:05:38+00:00 | {} | 2024-01-13T14:00:41+00:00 |
|
635ee7ecaa6b83a7a8b06ee7bc54ed30c931cc2e | cutterd/gelgen_tar_46 | [
"region:us"
] | 2024-01-13T13:05:38+00:00 | {} | 2024-01-13T13:59:01+00:00 |
|
2960b4ab43b42b1e52dc4e5da38095bbee62d42e | cutterd/gelgen_tar_47 | [
"region:us"
] | 2024-01-13T13:05:38+00:00 | {} | 2024-01-13T13:59:58+00:00 |
|
e8d70531b676ada7f3f3ed30bfb65a390dd3705b | cutterd/gelgen_tar_44 | [
"region:us"
] | 2024-01-13T13:05:38+00:00 | {} | 2024-01-13T14:00:01+00:00 |
|
dc3530424c79872393356e6d0eac8b3ae299025d | cutterd/gelgen_tar_50 | [
"region:us"
] | 2024-01-13T13:05:49+00:00 | {} | 2024-01-13T13:59:12+00:00 |
|
d14742097a5e7818e662494ba2041d9eebee226e | cutterd/gelgen_tar_48 | [
"region:us"
] | 2024-01-13T13:05:49+00:00 | {} | 2024-01-13T13:57:25+00:00 |
|
46648e3a291d8072688c732649b7be5108296a75 | cutterd/gelgen_tar_51 | [
"region:us"
] | 2024-01-13T13:05:49+00:00 | {} | 2024-01-13T14:00:31+00:00 |
|
818b6eab5a2cfc9be23a9ad85f9572eddc891f4b | cutterd/gelgen_tar_49 | [
"region:us"
] | 2024-01-13T13:05:49+00:00 | {} | 2024-01-13T14:00:33+00:00 |
|
f1da79bbc42579738fa30c1c313f32f69d0e2b10 | cutterd/gelgen_tar_53 | [
"region:us"
] | 2024-01-13T13:06:00+00:00 | {} | 2024-01-13T14:12:02+00:00 |
|
b239f868549afb5faf2488c847cc80d21f7c9906 | cutterd/gelgen_tar_55 | [
"region:us"
] | 2024-01-13T13:06:00+00:00 | {} | 2024-01-13T14:13:29+00:00 |
|
41d24169fb57a7c404fe5aff3cfb0e4ec40955f5 | cutterd/gelgen_tar_52 | [
"region:us"
] | 2024-01-13T13:06:00+00:00 | {} | 2024-01-13T14:13:32+00:00 |
|
9b6ef1fd10be973a0dd7b6bbe6e9f08fb8160a47 | cutterd/gelgen_tar_57 | [
"region:us"
] | 2024-01-13T13:06:44+00:00 | {} | 2024-01-13T14:22:11+00:00 |
|
c251f032f511c513f483d82d95bdb52fab0d7a6a | cutterd/gelgen_tar_56 | [
"region:us"
] | 2024-01-13T13:06:44+00:00 | {} | 2024-01-13T14:20:35+00:00 |
|
6b04f1479252b874badede764d5ddf57b44b6f84 | cutterd/gelgen_tar_58 | [
"region:us"
] | 2024-01-13T13:06:44+00:00 | {} | 2024-01-13T14:21:44+00:00 |
|
e2e4abf6ed28ec1213af023c3e06cbbc1c0f3f4f | cutterd/gelgen_tar_59 | [
"region:us"
] | 2024-01-13T13:06:44+00:00 | {} | 2024-01-13T14:23:30+00:00 |
|
6a90d76e9deb18657d8cdbb6e754528c43c3d4b9 | cutterd/gelgen_tar_42 | [
"region:us"
] | 2024-01-13T13:07:24+00:00 | {} | 2024-01-13T14:21:53+00:00 |
|
907804a9be164637ecff658f8df1abb349a51a25 | cutterd/gelgen_tar_40 | [
"region:us"
] | 2024-01-13T13:07:24+00:00 | {} | 2024-01-13T14:21:17+00:00 |
|
ab98c3c30c9dd98ecefba1ada30f1113eeadb673 | cutterd/gelgen_tar_41 | [
"region:us"
] | 2024-01-13T13:07:24+00:00 | {} | 2024-01-13T14:20:24+00:00 |
|
61e984cb16bbdf3342ad84b49cb7354487ae9ae3 | cutterd/gelgen_tar_43 | [
"region:us"
] | 2024-01-13T13:07:24+00:00 | {} | 2024-01-13T14:21:00+00:00 |
|
3b41e89bae6ca8b1a4c6d10b1d5eb66e54da938a | cutterd/gelgen_tar_62 | [
"region:us"
] | 2024-01-13T13:21:12+00:00 | {} | 2024-01-13T14:16:16+00:00 |
|
b984a8d2ac4423ba546cfbe161fb3aec850f239b | cutterd/gelgen_tar_60 | [
"region:us"
] | 2024-01-13T13:21:12+00:00 | {} | 2024-01-13T14:13:29+00:00 |
|
81668c5ad65f9a784ddc9ffb9a40966bad3c90a9 | cutterd/gelgen_tar_61 | [
"region:us"
] | 2024-01-13T13:21:12+00:00 | {} | 2024-01-13T14:17:25+00:00 |
|
3f270401e700a1d10b375e98ca35a281f92c9cd8 | cutterd/gelgen_tar_63 | [
"region:us"
] | 2024-01-13T13:21:12+00:00 | {} | 2024-01-13T14:15:53+00:00 |
|
497f165e995f6f9b6ca58fa9352f84d8c088bade | namankhator/web_questions_custom | [
"region:us"
] | 2024-01-13T13:33:32+00:00 | {} | 2024-01-13T16:52:01+00:00 |
|
8b6c8f4d95fb0b14d36b511672f9f39ffbd2784c | v4lkyr13/kevin | [
"region:us"
] | 2024-01-13T13:36:32+00:00 | {} | 2024-01-13T13:36:33+00:00 |
|
6598e47e85546253ebf132918c988d841c5c4f51 | ryo2/roly-poly-dataset | [
"region:us"
] | 2024-01-13T13:43:30+00:00 | {} | 2024-01-13T14:04:10+00:00 |
|
dda75a6cecd48f6d65be7d59a409df5232ad913f | dashtail/vozdodashtest | [
"region:us"
] | 2024-01-13T14:01:39+00:00 | {} | 2024-01-13T14:36:22+00:00 |
|
8b3a90b2bb78f5b0cc5959288589c1e631b2927c | pbk0/scappy | [
"region:us"
] | 2024-01-13T14:24:49+00:00 | {} | 2024-01-13T14:24:49+00:00 |
|
abfea3af41b221167305acaa37d04533473ccaab | Subuday/lj_speech | [
"region:us"
] | 2024-01-13T14:42:00+00:00 | {} | 2024-01-13T14:51:04+00:00 |
|
4cb59fdf19285d26317ff43eac0b09b618629b6d | TheOriginalMarcelo/dataset_nlp_enem_2k23 | [
"region:us"
] | 2024-01-13T14:43:28+00:00 | {} | 2024-01-13T14:47:03+00:00 |
|
d58e234ddcc1ffc9381068a94b47d88589830cf7 | Janez/mini-platypus-two | [
"region:us"
] | 2024-01-13T15:07:55+00:00 | {} | 2024-01-13T15:07:55+00:00 |
|
a960ce3b8d30a459436d62a3842f933b7c5a695a | sunshine-lwt/Osprey-ValData | [
"region:us"
] | 2024-01-13T15:09:43+00:00 | {} | 2024-01-13T15:29:21+00:00 |
|
24d01c36a763e781ee5102aa54a2ca0e57df2c49 | Janez/mini-platypus-p1 | [
"region:us"
] | 2024-01-13T15:10:25+00:00 | {} | 2024-01-13T15:10:26+00:00 |
|
fbcf9e66cdbdbce83b5a223b34b03c10dfa8522b | ahmettasdemir/mini-platypus | [
"region:us"
] | 2024-01-13T15:31:23+00:00 | {} | 2024-01-13T15:31:26+00:00 |
|
fe96bcc6d46050212d79c9e971dcd79d6aaf8752 | joshmittal/mini-platypus | [
"region:us"
] | 2024-01-13T15:37:44+00:00 | {} | 2024-01-13T15:37:44+00:00 |
|
5ceca22880cbdc1b3ec3972b17a6777535c3e15e | firqaaa/T2Rerank-Bahasa | [
"region:us"
] | 2024-01-13T15:55:17+00:00 | {} | 2024-01-13T15:55:17+00:00 |
|
33d5c482d838aea4a816f83963b3ef76b0cb3040 | damerajee/en-kannada-v3 | [
"region:us"
] | 2024-01-13T16:27:23+00:00 | {} | 2024-01-13T16:27:23+00:00 |
|
b0992df96c871f002405549adeb26b0f66f34490 | shokhjakhon/open_llama-2-9k | [
"region:us"
] | 2024-01-13T16:49:26+00:00 | {} | 2024-01-13T16:49:26+00:00 |
|
cc61a2e51f8beeca6ad5e9bb086802df67c08d66 | Yankz/mini-platypus | [
"region:us"
] | 2024-01-13T16:56:33+00:00 | {} | 2024-01-13T16:56:33+00:00 |
|
2c84d52ca4bc18f76d13c124e177a633ef78cf04 | sizhkhy/kolektor_sdd2 | [
"region:us"
] | 2024-01-13T17:16:41+00:00 | {"dataset_info": {"features": [{"name": "path", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 610807578.625, "num_examples": 2331}, {"name": "valid", "num_bytes": 263360243.5, "num_examples": 1004}], "download_size": 874061649, "dataset_size": 874167822.125}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "valid", "path": "data/valid-*"}]}]} | 2024-01-13T18:22:27+00:00 |
|
bbe613e1fa533647369080785fa2069cfac9e766 |
# Dataset Card for Evaluation run of alnrg2arg/test_wanda_240109
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test_wanda_240109](https://huggingface.co/alnrg2arg/test_wanda_240109) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
"harness_winogrande_5",
split="train")
```
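Building on the example above, the following is a minimal sketch of pulling the aggregated numbers instead of per-example details. The "results" configuration name comes from the card text; the "latest" split name is an assumption based on the configuration listing in this repo's metadata and may differ.
```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration described above.
# The "latest" split name is an assumption (it mirrors the most recent run);
# timestamped splits such as "2024_01_13T17_19_19.094893" hold individual runs.
results = load_dataset(
    "open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
    "results",
    split="latest",
)
print(results[0])
```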
## Latest results
These are the [latest results from run 2024-01-13T17:19:19.094893](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test_wanda_240109/blob/main/results_2024-01-13T17-19-19.094893.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23401038489636644,
"acc_stderr": 0.029968361313724278,
"acc_norm": 0.23351347966222002,
"acc_norm_stderr": 0.0307471687800331,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407305,
"acc_norm": 0.2295221843003413,
"acc_norm_stderr": 0.012288926760890797
},
"harness|hellaswag|10": {
"acc": 0.25542720573590916,
"acc_stderr": 0.004352098082984431,
"acc_norm": 0.2526389165504879,
"acc_norm_stderr": 0.004336375492801798
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235173,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.037752516806863715,
"acc_norm": 0.17,
"acc_norm_stderr": 0.037752516806863715
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22349936143039592,
"acc_stderr": 0.014897235229450707,
"acc_norm": 0.22349936143039592,
"acc_norm_stderr": 0.014897235229450707
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500114,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697165,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4988161010260458,
"acc_stderr": 0.014052446290529019
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
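For the raw aggregate file itself, a minimal sketch (assuming the results filename linked above is still present in the repo) is to fetch it directly with `huggingface_hub` and read the JSON:
```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_alnrg2arg__test_wanda_240109",
    filename="results_2024-01-13T17-19-19.094893.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The file holds the metrics shown in the block above; the exact nesting may
# vary between harness versions, so inspect the top-level keys first.
print(list(results.keys()))
```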
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alnrg2arg__test_wanda_240109 | [
"region:us"
] | 2024-01-13T17:17:03+00:00 | {"pretty_name": "Evaluation run of alnrg2arg/test_wanda_240109", "dataset_summary": "Dataset automatically created during the evaluation run of model [alnrg2arg/test_wanda_240109](https://huggingface.co/alnrg2arg/test_wanda_240109) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test_wanda_240109\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:19:19.094893](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test_wanda_240109/blob/main/results_2024-01-13T17-19-19.094893.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23401038489636644,\n \"acc_stderr\": 0.029968361313724278,\n \"acc_norm\": 0.23351347966222002,\n \"acc_norm_stderr\": 0.0307471687800331,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407305,\n \"acc_norm\": 0.2295221843003413,\n \"acc_norm_stderr\": 0.012288926760890797\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25542720573590916,\n \"acc_stderr\": 0.004352098082984431,\n \"acc_norm\": 0.2526389165504879,\n \"acc_norm_stderr\": 0.004336375492801798\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 
0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235173,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.037752516806863715,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.037752516806863715\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n 
\"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22349936143039592,\n \"acc_stderr\": 0.014897235229450707,\n \"acc_norm\": 0.22349936143039592,\n \"acc_norm_stderr\": 
0.014897235229450707\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500114,\n \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500114\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4988161010260458,\n \"acc_stderr\": 0.014052446290529019\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/alnrg2arg/test_wanda_240109", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": 
[{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-14-43.764095.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-14-43.764095.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet", 
"**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": 
"2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["**/details_harness|winogrande|5_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": ["**/details_harness|winogrande|5_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-19-19.094893.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T17_14_43.764095", "path": ["results_2024-01-13T17-14-43.764095.parquet"]}, {"split": "2024_01_13T17_19_19.094893", "path": 
["results_2024-01-13T17-19-19.094893.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-19-19.094893.parquet"]}]}]} | 2024-01-13T17:21:58+00:00 |
2a5109f47f3ee42f69c51145df5d823de592857a |
# Dataset of trieste/トリエステ/的里雅斯特 (Azur Lane)
This is the dataset of trieste/トリエステ/的里雅斯特 (Azur Lane), containing 92 images and their tags.
The core tags of this character are `breasts, hair_over_one_eye, large_breasts, long_hair, green_eyes, pink_hair, ponytail, hair_ornament, mole, hairclip, bangs, mole_under_eye, earrings, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 92 | 151.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 92 | 71.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 222 | 157.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 92 | 126.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 222 | 248.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/trieste_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
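If you prefer to fetch one of the packaged variants programmatically rather than through the download links above, the snippet below is a minimal sketch (the `extract_dir` path is just an example name) that pulls the `dataset-800.zip` package and unpacks its IMG+TXT pairs:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package of this repository
pkg_file = hf_hub_download(
    repo_id='CyberHarem/trieste_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# unpack the image/caption pairs into a local folder (example path)
extract_dir = 'trieste_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(pkg_file, 'r') as zf:
    zf.extractall(extract_dir)

# each image is paired with a same-named .txt file holding its tags
print(sorted(os.listdir(extract_dir))[:10])
```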
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/trieste_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | onsen, water, 1girl, cleavage, collarbone, looking_at_viewer, solo, blush, wet, nude_cover, sitting, steam, naked_towel, outdoors, night, partially_submerged, sky |
| 1 | 34 |  |  |  |  |  | 1girl, military_uniform, sideboob, solo, black_skirt, miniskirt, jewelry, aiguillette, epaulettes, pencil_skirt, looking_at_viewer, black_jacket, white_gloves, black_pantyhose, mole_on_body, cape, black_coat, simple_background, standing, cowboy_shot, hair_between_eyes, white_background, revealing_clothes, sidelocks, armpits, holding, clipboard |
| 2 | 26 |  |  |  |  |  | white_shirt, 1girl, looking_at_viewer, solo, short_sleeves, pleated_skirt, red_neckerchief, official_alternate_costume, miniskirt, crop_top, serafuku, black_pantyhose, blue_sailor_collar, blue_skirt, blush, navel, ribbon, midriff, see-through_silhouette, black_skirt, standing, wristwatch |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | onsen | water | 1girl | cleavage | collarbone | looking_at_viewer | solo | blush | wet | nude_cover | sitting | steam | naked_towel | outdoors | night | partially_submerged | sky | military_uniform | sideboob | black_skirt | miniskirt | jewelry | aiguillette | epaulettes | pencil_skirt | black_jacket | white_gloves | black_pantyhose | mole_on_body | cape | black_coat | simple_background | standing | cowboy_shot | hair_between_eyes | white_background | revealing_clothes | sidelocks | armpits | holding | clipboard | white_shirt | short_sleeves | pleated_skirt | red_neckerchief | official_alternate_costume | crop_top | serafuku | blue_sailor_collar | blue_skirt | navel | ribbon | midriff | see-through_silhouette | wristwatch |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-----------|:-------------|:--------------------|:-------|:--------|:------|:-------------|:----------|:--------|:--------------|:-----------|:--------|:----------------------|:------|:-------------------|:-----------|:--------------|:------------|:----------|:--------------|:-------------|:---------------|:---------------|:---------------|:------------------|:---------------|:-------|:-------------|:--------------------|:-----------|:--------------|:--------------------|:-------------------|:--------------------|:------------|:----------|:----------|:------------|:--------------|:----------------|:----------------|:------------------|:-----------------------------|:-----------|:-----------|:---------------------|:-------------|:--------|:---------|:----------|:-------------------------|:-------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 34 |  |  |  |  |  | | | X | | | X | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 26 |  |  |  |  |  | | | X | | | X | X | X | | | | | | | | | | | | X | X | | | | | | | X | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/trieste_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:17:32+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:42:06+00:00 |
ea7e009ae71a99466156e7413336be1d0516857c |
# Dataset of guam/グアム/关岛 (Azur Lane)
This is the dataset of guam/グアム/关岛 (Azur Lane), containing 54 images and their tags.
The core tags of this character are `blonde_hair, breasts, long_hair, large_breasts, bangs, blue_eyes, very_long_hair, twintails, symbol-shaped_pupils, animal_ears, hat, purple_eyes, rabbit_ears, hair_ornament, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 54 | 116.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 54 | 52.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 147 | 120.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 54 | 96.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 147 | 194.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/guam_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/guam_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from the clusters below.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, blush, cleavage, fake_animal_ears, looking_at_viewer, open_mouth, playboy_bunny, solo, white_gloves, facial_tattoo, fang, simple_background, thighs, top_hat, two_side_up, white_background, :d, blue_leotard, thighhighs, collarbone, detached_collar, heart-shaped_pupils, holding, huge_breasts, ribbon |
| 1 | 5 |  |  |  |  |  | 1girl, black_gloves, covered_navel, fingerless_gloves, smile, solo, long_sleeves, looking_at_viewer, open_mouth, black_thighhighs, blush, hairclip, highleg_leotard, leotard_under_clothes, one_eye_closed, sidelocks, thighs, \m/, cowboy_shot, skindentation, standing, star_(symbol) |
| 2 | 5 |  |  |  |  |  | 1girl, black_gloves, blush, looking_at_viewer, smile, solo, :q, fingerless_gloves, +_+, hairclip, headgear, heart, simple_background, star_(symbol), upper_body, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | cleavage | fake_animal_ears | looking_at_viewer | open_mouth | playboy_bunny | solo | white_gloves | facial_tattoo | fang | simple_background | thighs | top_hat | two_side_up | white_background | :d | blue_leotard | thighhighs | collarbone | detached_collar | heart-shaped_pupils | holding | huge_breasts | ribbon | black_gloves | covered_navel | fingerless_gloves | smile | long_sleeves | black_thighhighs | hairclip | highleg_leotard | leotard_under_clothes | one_eye_closed | sidelocks | \m/ | cowboy_shot | skindentation | standing | star_(symbol) | :q | +_+ | headgear | heart | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-----------|:-------------------|:--------------------|:-------------|:----------------|:-------|:---------------|:----------------|:-------|:--------------------|:---------|:----------|:--------------|:-------------------|:-----|:---------------|:-------------|:-------------|:------------------|:----------------------|:----------|:---------------|:---------|:---------------|:----------------|:--------------------|:--------|:---------------|:-------------------|:-----------|:------------------|:------------------------|:-----------------|:------------|:------|:--------------|:----------------|:-----------|:----------------|:-----|:------|:-----------|:--------|:-------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | | | X | X | | X | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | | X | | | X | | | | X | | | | X | | | | | | | | | | X | | X | X | | | X | | | | | | | | | X | X | X | X | X | X |
| CyberHarem/guam_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T17:17:41+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T17:32:41+00:00 |
f7b0623743defc554be1d26536694ed8ed6a6fda | jailson23232/renanplay | [
"license:openrail",
"region:us"
] | 2024-01-13T17:18:45+00:00 | {"license": "openrail"} | 2024-01-13T17:20:23+00:00 |
|
4333d8fda7ce5c54cccfbbba597353a66b646554 |
# Dataset Card for Evaluation run of flemmingmiguel/NeuDist-Ro-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/NeuDist-Ro-7B](https://huggingface.co/flemmingmiguel/NeuDist-Ro-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B",
                    "harness_winogrande_5",
                    split="train")
```
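Each per-task configuration also has a "latest" split pointing at the most recent run, and the aggregated metrics are stored in the "results" configuration; a minimal sketch of loading both, following the same pattern as above, might look like this:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B"

# per-task details from the most recent run
winogrande_latest = load_dataset(repo, "harness_winogrande_5", split="latest")

# aggregated results of the run
results = load_dataset(repo, "results", split="latest")
print(results)
```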
## Latest results
These are the [latest results from run 2024-01-13T17:17:37.802131](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B/blob/main/results_2024-01-13T17-17-37.802131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6564397980772864,
"acc_stderr": 0.03204722059494361,
"acc_norm": 0.6561722533035016,
"acc_norm_stderr": 0.032711908051799604,
"mc1": 0.49571603427172584,
"mc1_stderr": 0.01750285857737128,
"mc2": 0.6493023269912708,
"mc2_stderr": 0.015276465453752726
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688065,
"acc_norm": 0.712457337883959,
"acc_norm_stderr": 0.013226719056266127
},
"harness|hellaswag|10": {
"acc": 0.6977693686516631,
"acc_stderr": 0.004582861219020889,
"acc_norm": 0.8748257319259112,
"acc_norm_stderr": 0.0033024011069263197
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700476,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.01654240195463191,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.01654240195463191
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49571603427172584,
"mc1_stderr": 0.01750285857737128,
"mc2": 0.6493023269912708,
"mc2_stderr": 0.015276465453752726
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
},
"harness|gsm8k|5": {
"acc": 0.709628506444276,
"acc_stderr": 0.01250359248181895
}
}
```
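Since the block above is plain JSON, individual scores can be read with ordinary dictionary access once it is loaded. The following is a minimal sketch, assuming the snippet above has been saved locally as `results.json` (a placeholder name; the full results file on the Hub may wrap these entries in additional keys).

```python
import json

# Hypothetical local copy of the JSON results shown above.
with open("results.json") as f:
    results = json.load(f)

# Each task is keyed by its harness name, e.g. the 5-shot GSM8K entry.
gsm8k = results["harness|gsm8k|5"]
print(gsm8k["acc"], gsm8k["acc_stderr"])
```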
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B | [
"region:us"
] | 2024-01-13T17:19:56+00:00 | {"pretty_name": "Evaluation run of flemmingmiguel/NeuDist-Ro-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [flemmingmiguel/NeuDist-Ro-7B](https://huggingface.co/flemmingmiguel/NeuDist-Ro-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:17:37.802131](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__NeuDist-Ro-7B/blob/main/results_2024-01-13T17-17-37.802131.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6564397980772864,\n \"acc_stderr\": 0.03204722059494361,\n \"acc_norm\": 0.6561722533035016,\n \"acc_norm_stderr\": 0.032711908051799604,\n \"mc1\": 0.49571603427172584,\n \"mc1_stderr\": 0.01750285857737128,\n \"mc2\": 0.6493023269912708,\n \"mc2_stderr\": 0.015276465453752726\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688065,\n \"acc_norm\": 0.712457337883959,\n \"acc_norm_stderr\": 0.013226719056266127\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6977693686516631,\n \"acc_stderr\": 0.004582861219020889,\n \"acc_norm\": 0.8748257319259112,\n \"acc_norm_stderr\": 0.0033024011069263197\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700476,\n \"acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700476\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992002,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992002\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.01654240195463191,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.01654240195463191\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49571603427172584,\n \"mc1_stderr\": 0.01750285857737128,\n \"mc2\": 0.6493023269912708,\n \"mc2_stderr\": 0.015276465453752726\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.709628506444276,\n \"acc_stderr\": 
0.01250359248181895\n }\n}\n```", "repo_url": "https://huggingface.co/flemmingmiguel/NeuDist-Ro-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_17_37.802131", "path": ["**/details_harness|winogrande|5_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-17-37.802131.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_17_37.802131", "path": ["results_2024-01-13T17-17-37.802131.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-17-37.802131.parquet"]}]}]} | 2024-01-13T17:20:18+00:00 |
8d002b6c3f58f5fb73c0e6dde1c8f9242dfb590f | version-control/arrayblow-1.0-train | [
"region:us"
] | 2024-01-13T17:20:32+00:00 | {"dataset_info": {"features": [{"name": "repo_name", "dtype": "string"}, {"name": "hexsha", "dtype": "string"}, {"name": "code", "dtype": "string"}, {"name": "file_path", "dtype": "string"}, {"name": "api_extract", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7306756, "num_examples": 437}], "download_size": 2407574, "dataset_size": 7306756}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-13T17:20:37+00:00 |
|
b8dec85c46dee5f9217240e5eb51a2797abdd39b | Cshavi/embeddings | [
"region:us"
] | 2024-01-13T17:21:04+00:00 | {} | 2024-01-13T17:21:39+00:00 |
|
f26e6a73877bb3292ba339d7ef3ee169e0eab72e |
# Dataset Card for Evaluation run of kodonho/SolarM-SakuraSolar-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kodonho/SolarM-SakuraSolar-SLERP](https://huggingface.co/kodonho/SolarM-SakuraSolar-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP",
"harness_winogrande_5",
split="train")
```
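If you only need the aggregated metrics rather than the per-sample details, you can load the "results" configuration mentioned above instead. The snippet below is a small sketch that assumes the "latest" split name used throughout this card; the exact column layout of the returned rows may differ between runs.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
# "results" is the extra configuration described above; "latest" points to
# the newest run. The structure of each row is an assumption, not guaranteed.
results = load_dataset(
    "open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP",
    "results",
    split="latest",
)
print(results[0])
```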
## Latest results
These are the [latest results from run 2024-01-13T17:26:45.129484](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP/blob/main/results_2024-01-13T17-26-45.129484.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6661841092869378,
"acc_stderr": 0.031618934361982654,
"acc_norm": 0.6670035390095326,
"acc_norm_stderr": 0.03226162867427471,
"mc1": 0.5667074663402693,
"mc1_stderr": 0.01734702445010748,
"mc2": 0.7209867000245426,
"mc2_stderr": 0.014980299607085815
},
"harness|arc:challenge|25": {
"acc": 0.6877133105802048,
"acc_stderr": 0.013542598541688067,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428173
},
"harness|hellaswag|10": {
"acc": 0.714797849034057,
"acc_stderr": 0.0045058790846068415,
"acc_norm": 0.8846843258315077,
"acc_norm_stderr": 0.003187497509087418
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4973544973544973,
"acc_stderr": 0.02575094967813039,
"acc_norm": 0.4973544973544973,
"acc_norm_stderr": 0.02575094967813039
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.03372343271653062,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.03372343271653062
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.016337268694270105,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.016337268694270105
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49478487614080835,
"acc_stderr": 0.012769541449652547,
"acc_norm": 0.49478487614080835,
"acc_norm_stderr": 0.012769541449652547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5667074663402693,
"mc1_stderr": 0.01734702445010748,
"mc2": 0.7209867000245426,
"mc2_stderr": 0.014980299607085815
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838899
},
"harness|gsm8k|5": {
"acc": 0.6467020470053071,
"acc_stderr": 0.013166337192115683
}
}
```
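If you prefer to work with the raw JSON file linked above rather than the `datasets` library, a minimal sketch (assuming `huggingface_hub` is installed) is to download it directly from the dataset repository and parse it:
```python
import json
from huggingface_hub import hf_hub_download

# Download the results file referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP",
    filename="results_2024-01-13T17-26-45.129484.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# Inspect the top-level structure; the per-task metrics shown above are stored in this file.
print(list(raw.keys()))
```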
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP | [
"region:us"
] | 2024-01-13T17:29:01+00:00 | {"pretty_name": "Evaluation run of kodonho/SolarM-SakuraSolar-SLERP", "dataset_summary": "Dataset automatically created during the evaluation run of model [kodonho/SolarM-SakuraSolar-SLERP](https://huggingface.co/kodonho/SolarM-SakuraSolar-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:26:45.129484](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__SolarM-SakuraSolar-SLERP/blob/main/results_2024-01-13T17-26-45.129484.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6661841092869378,\n \"acc_stderr\": 0.031618934361982654,\n \"acc_norm\": 0.6670035390095326,\n \"acc_norm_stderr\": 0.03226162867427471,\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.01734702445010748,\n \"mc2\": 0.7209867000245426,\n \"mc2_stderr\": 0.014980299607085815\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6877133105802048,\n \"acc_stderr\": 0.013542598541688067,\n \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428173\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.714797849034057,\n \"acc_stderr\": 0.0045058790846068415,\n \"acc_norm\": 0.8846843258315077,\n \"acc_norm_stderr\": 0.003187497509087418\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4973544973544973,\n \"acc_stderr\": 0.02575094967813039,\n \"acc_norm\": 0.4973544973544973,\n \"acc_norm_stderr\": 0.02575094967813039\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.03372343271653062,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.03372343271653062\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8045977011494253,\n \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n \"acc_stderr\": 0.016337268694270105,\n \"acc_norm\": 0.39329608938547483,\n \"acc_norm_stderr\": 0.016337268694270105\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0227797190887334,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0227797190887334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49478487614080835,\n \"acc_stderr\": 0.012769541449652547,\n \"acc_norm\": 0.49478487614080835,\n \"acc_norm_stderr\": 0.012769541449652547\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.01734702445010748,\n \"mc2\": 0.7209867000245426,\n \"mc2_stderr\": 0.014980299607085815\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838899\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6467020470053071,\n \"acc_stderr\": 0.013166337192115683\n 
}\n}\n```", "repo_url": "https://huggingface.co/kodonho/SolarM-SakuraSolar-SLERP", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_26_45.129484", "path": ["**/details_harness|winogrande|5_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-26-45.129484.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_26_45.129484", "path": ["results_2024-01-13T17-26-45.129484.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-26-45.129484.parquet"]}]}]} | 2024-01-13T17:29:22+00:00 |
7d0c0d94e45731858af0830be29ef6fd91826abd |
# Dataset Card for Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP](https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP",
"harness_winogrande_5",
split="train")
```
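
The aggregated metrics mentioned above can be loaded in the same way from the "results" configuration. This is a minimal sketch assuming the "latest" split is available as listed in the configuration metadata; the exact column layout of the underlying parquet file may vary between runs.

```python
from datasets import load_dataset

# Load the aggregated results of the run; the "latest" split always points
# to the most recent evaluation for this model.
results = load_dataset(
    "open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP",
    "results",
    split="latest",
)
print(results)
```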
## Latest results
These are the [latest results from run 2024-01-13T17:32:35.779900](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP/blob/main/results_2024-01-13T17-32-35.779900.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6659359197324706,
"acc_stderr": 0.03167249441105516,
"acc_norm": 0.6667077779566729,
"acc_norm_stderr": 0.032318448519432046,
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7195437123974021,
"mc2_stderr": 0.01500878766115849
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173306,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520766
},
"harness|hellaswag|10": {
"acc": 0.7105158334993029,
"acc_stderr": 0.004525960965551707,
"acc_norm": 0.882194781915953,
"acc_norm_stderr": 0.003217184906847944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346758,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346758
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.02907937453948001,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.02907937453948001
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.016361354769822475,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.016361354769822475
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49608865710560623,
"acc_stderr": 0.012769845366441194,
"acc_norm": 0.49608865710560623,
"acc_norm_stderr": 0.012769845366441194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146366,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146366
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7195437123974021,
"mc2_stderr": 0.01500878766115849
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.010450899545370634
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146877
}
}
```
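
The raw JSON file linked above can also be fetched directly. The sketch below uses `huggingface_hub`; the filename is taken from the link in the "Latest results" section and is assumed to still be present in the repository, and the top-level layout of the JSON is inspected rather than assumed.

```python
import json

from huggingface_hub import hf_hub_download

# Download the per-run results JSON referenced in the "Latest results" section.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP",
    filename="results_2024-01-13T17-32-35.779900.json",
    repo_type="dataset",
)

with open(path) as f:
    run_results = json.load(f)

# Inspect the top-level structure before drilling into specific metrics.
print(sorted(run_results.keys()))
```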
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP | [
"region:us"
] | 2024-01-13T17:34:57+00:00 | {"pretty_name": "Evaluation run of kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP", "dataset_summary": "Dataset automatically created during the evaluation run of model [kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP](https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T17:32:35.779900](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Solar-OrcaDPO-Solar-Instruct-SLERP/blob/main/results_2024-01-13T17-32-35.779900.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6659359197324706,\n \"acc_stderr\": 0.03167249441105516,\n \"acc_norm\": 0.6667077779566729,\n \"acc_norm_stderr\": 0.032318448519432046,\n \"mc1\": 0.5679314565483476,\n \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7195437123974021,\n \"mc2_stderr\": 0.01500878766115849\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173306,\n \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520766\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n \"acc_stderr\": 0.004525960965551707,\n \"acc_norm\": 0.882194781915953,\n \"acc_norm_stderr\": 0.003217184906847944\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.031489558297455304,\n \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.031489558297455304\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346758\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.02907937453948001,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.02907937453948001\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n \"acc_stderr\": 0.016361354769822475,\n \"acc_norm\": 0.39664804469273746,\n \"acc_norm_stderr\": 0.016361354769822475\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49608865710560623,\n \"acc_stderr\": 0.012769845366441194,\n \"acc_norm\": 0.49608865710560623,\n \"acc_norm_stderr\": 0.012769845366441194\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146366,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146366\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5679314565483476,\n \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7195437123974021,\n \"mc2_stderr\": 0.01500878766115849\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.010450899545370634\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \"acc_stderr\": 0.013113898382146877\n 
}\n}\n```", "repo_url": "https://huggingface.co/kodonho/Solar-OrcaDPO-Solar-Instruct-SLERP", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T17_32_35.779900", "path": ["**/details_harness|winogrande|5_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T17-32-35.779900.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T17_32_35.779900", "path": ["results_2024-01-13T17-32-35.779900.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T17-32-35.779900.parquet"]}]}]} | 2024-01-13T17:35:19+00:00 |
b2a292050a3edab5957ecf96ac6a4abd44617711 |
# Dataset: PhilEO Downstream Tasks
A novel 400 GB Sentinel-2 dataset from the PhilEO Bench, containing labels for the three downstream tasks of building density estimation, road segmentation, and land cover classification.
## Dataset Details
### Dataset Description
The PhilEO dataset is a 400 GB global dataset of Sentinel-2 images, labelled for the three downstream tasks: road segmentation, building density estimation, and land cover classification. The data are sampled from geographically diverse regions around the globe, including Denmark, East Africa, Egypt, Guinea, Europe, Ghana, Israel, Japan, Nigeria, North America, Senegal, South America, Tanzania, and Uganda. Each region has up to 200 tiles of varying sizes, and some locations have been revisited up to three times.
The data contain 11 bands at 10 m resolution in the following order: 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08, 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, and 10-B12, where SCL is the Scene Classification Layer.
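As a quick orientation, the sketch below shows how a single tile could be loaded and the spectral bands separated from the SCL mask using the band order listed above. The file name, the `.npy` storage format, and the channel-last layout are assumptions made purely for illustration; consult the repository for the actual file structure.

```python
import numpy as np

# Band layout as documented above (channel-last layout assumed for illustration):
# 0-SCL, 1-B02, 2-B03, 3-B04, 4-B08, 5-B05, 6-B06, 7-B07, 8-B8A, 9-B11, 10-B12
BAND_NAMES = ["SCL", "B02", "B03", "B04", "B08",
              "B05", "B06", "B07", "B8A", "B11", "B12"]

# Hypothetical tile path; actual file naming and format follow the repository layout.
tile = np.load("denmark_tile_0001.npy")  # assumed shape: (H, W, 11)

scl_mask = tile[..., 0]                  # Scene Classification Layer
rgb = tile[..., [3, 2, 1]]               # B04, B03, B02 -> true-colour composite
nir = tile[..., 4]                       # B08
red = tile[..., 3]                       # B04

# Example derived quantity: NDVI from the 10 m red and near-infrared bands.
ndvi = (nir - red) / (nir + red + 1e-6)
print(scl_mask.shape, rgb.shape, float(ndvi.mean()))
```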
- **Curated by:** ESA Phi-lab
- **License:** MIT
## Uses
The dataset can be used to evaluate any Earth Observation (EO) foundation model on the three downstream tasks.
### Dataset Sources
Key links for the dataset (a minimal download sketch follows this list):
- **Repository:** http://huggingface.co/datasets/ESA-philab/PhilEO-downstream
- **Paper:** http://arxiv.org/pdf/2401.04464.pdf
- **Project Website:** http://phileo-bench.github.io
- **Code GitHub:** http://github.com/ESA-PhiLab/PhilEO-Bench
- **Dataset also in:** http://www.eotdl.com/datasets/PhilEO-downstream
- **arXiv:** http://arxiv.org/abs/2401.04464
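For convenience, the Hugging Face repository listed above can be fetched with the standard `huggingface_hub` download API. The sketch below is only an illustration; the `allow_patterns` filter is an assumption about the file layout and may need to be adjusted or removed.

```python
from huggingface_hub import snapshot_download

# Download (or reuse a cached copy of) the dataset repository.
# allow_patterns is an illustrative filter; adjust it to the actual file layout.
local_dir = snapshot_download(
    repo_id="ESA-philab/PhilEO-downstream",
    repo_type="dataset",
    allow_patterns=["*.npy", "*.json"],
)
print("Dataset files downloaded to:", local_dir)
```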
## Citation
Casper Fibaek, Luke Camilleri, Andreas Luyts, Nikolaos Dionelis, and Bertrand Le Saux, “PhilEO Bench: Evaluating Geo-Spatial Foundation Models,” arXiv:2401.04464, 2024.
| ESA-philab/PhilEO-downstream | [
"license:mit",
"arxiv:2401.04464",
"region:us"
] | 2024-01-13T17:39:24+00:00 | {"license": "mit"} | 2024-02-03T13:41:09+00:00 |
9f5072bb6b336b441251ce471b4b5fac3f4e2768 | zukoochan/tripleSNXT | [
"region:us"
] | 2024-01-13T17:41:37+00:00 | {} | 2024-01-13T17:47:48+00:00 |