| Column | Type | Min length | Max length |
|---|---|---|---|
| sha | string | 40 | 40 |
| text | string | 1 | 13.4M |
| id | string | 2 | 117 |
| tags | sequence | 1 | 7.91k |
| created_at | string | 25 | 25 |
| metadata | string | 2 | 875k |
| last_modified | string | 25 | 25 |
| arxiv | sequence | 0 | 25 |
| languages | sequence | 0 | 7.91k |
| tags_str | string | 17 | 159k |
| text_str | string | 1 | 447k |
| text_lists | sequence | 0 | 352 |
| processed_texts | sequence | 1 | 353 |
**sha:** `aa0819b4cdd8a2a2683739853a2fd56b26e42f30`
# Dataset Card for Evaluation run of jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup](https://huggingface.co/jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup",
    "harness_winogrande_5",
    split="train",
)
```
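The same pattern works for any of the 63 configurations. As a minimal sketch (assuming a recent `datasets` release, whose stock `get_dataset_config_names` helper lists configurations), you can enumerate the configs and load the aggregated "results" configuration described above:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# "latest" is an alias for the most recent run's timestamped split.
results = load_dataset(repo, "results", split="latest")
print(results)
```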
## Latest results

These are the [latest results from run 2024-02-12T08:24:49.664707](https://huggingface.co/datasets/open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup/blob/main/results_2024-02-12T08-24-49.664707.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.604711622779494, "acc_stderr": 0.032467251718581315, "acc_norm": 0.6163715674554368, "acc_norm_stderr": 0.03334110735529664, "mc1": 0.2937576499388005, "mc1_stderr": 0.015945068581236618, "mc2": 0.45375623938134657, "mc2_stderr": 0.015248519436290428 },
    "harness|arc:challenge|25": { "acc": 0.5750853242320819, "acc_stderr": 0.014445698968520763, "acc_norm": 0.6006825938566553, "acc_norm_stderr": 0.014312094557946704 },
    "harness|hellaswag|10": { "acc": 0.6222863971320454, "acc_stderr": 0.004838246410786262, "acc_norm": 0.8218482374029078, "acc_norm_stderr": 0.0038185843846355303 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03745554791462456, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03745554791462456 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.5780346820809249, "acc_stderr": 0.0376574669386515, "acc_norm": 0.5780346820809249, "acc_norm_stderr": 0.0376574669386515 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.048108401480826346, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.048108401480826346 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.76, "acc_stderr": 0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.503448275862069, "acc_stderr": 0.04166567577101579, "acc_norm": 0.503448275862069, "acc_norm_stderr": 0.04166567577101579 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35978835978835977, "acc_stderr": 0.02471807594412928, "acc_norm": 0.35978835978835977, "acc_norm_stderr": 0.02471807594412928 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.38095238095238093, "acc_stderr": 0.04343525428949098, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.04343525428949098 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.04793724854411019, "acc_norm": 0.35, "acc_norm_stderr": 0.04793724854411019 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7064516129032258, "acc_stderr": 0.025906087021319295, "acc_norm": 0.7064516129032258, "acc_norm_stderr": 0.025906087021319295 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4433497536945813, "acc_stderr": 0.034953345821629345, "acc_norm": 0.4433497536945813, "acc_norm_stderr": 0.034953345821629345 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8, "acc_stderr": 0.031234752377721175, "acc_norm": 0.8, "acc_norm_stderr": 0.031234752377721175 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.0291265228345868, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.0291265228345868 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8393782383419689, "acc_stderr": 0.02649905770139744, "acc_norm": 0.8393782383419689, "acc_norm_stderr": 0.02649905770139744 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6076923076923076, "acc_stderr": 0.02475600038213095, "acc_norm": 0.6076923076923076, "acc_norm_stderr": 0.02475600038213095 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.27037037037037037, "acc_stderr": 0.02708037281514565, "acc_norm": 0.27037037037037037, "acc_norm_stderr": 0.02708037281514565 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31125827814569534, "acc_stderr": 0.03780445850526732, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526732 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7743119266055046, "acc_stderr": 0.017923087667803064, "acc_norm": 0.7743119266055046, "acc_norm_stderr": 0.017923087667803064 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5740740740740741, "acc_stderr": 0.033723432716530624, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.033723432716530624 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931048, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931048 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8227848101265823, "acc_stderr": 0.02485636418450322, "acc_norm": 0.8227848101265823, "acc_norm_stderr": 0.02485636418450322 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6771300448430493, "acc_stderr": 0.031381476375754995, "acc_norm": 0.6771300448430493, "acc_norm_stderr": 0.031381476375754995 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.648854961832061, "acc_stderr": 0.04186445163013751, "acc_norm": 0.648854961832061, "acc_norm_stderr": 0.04186445163013751 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.743801652892562, "acc_stderr": 0.03984979653302872, "acc_norm": 0.743801652892562, "acc_norm_stderr": 0.03984979653302872 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.035590395316173425, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.035590395316173425 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.046840993210771065, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.046840993210771065 },
    "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281376, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281376 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7982120051085568, "acc_stderr": 0.014351702181636863, "acc_norm": 0.7982120051085568, "acc_norm_stderr": 0.014351702181636863 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7052023121387283, "acc_stderr": 0.024547617794803828, "acc_norm": 0.7052023121387283, "acc_norm_stderr": 0.024547617794803828 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.28938547486033517, "acc_stderr": 0.015166544550490298, "acc_norm": 0.28938547486033517, "acc_norm_stderr": 0.015166544550490298 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.6797385620915033, "acc_stderr": 0.026716118380156847, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.026716118380156847 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.6816720257234726, "acc_stderr": 0.026457225067811032, "acc_norm": 0.6816720257234726, "acc_norm_stderr": 0.026457225067811032 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.6882716049382716, "acc_stderr": 0.025773111169630457, "acc_norm": 0.6882716049382716, "acc_norm_stderr": 0.025773111169630457 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.029752389657427047, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.029752389657427047 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.4667535853976532, "acc_stderr": 0.012741974333897224, "acc_norm": 0.4667535853976532, "acc_norm_stderr": 0.012741974333897224 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6654411764705882, "acc_stderr": 0.0286619962023353, "acc_norm": 0.6654411764705882, "acc_norm_stderr": 0.0286619962023353 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6258169934640523, "acc_stderr": 0.019576953122088837, "acc_norm": 0.6258169934640523, "acc_norm_stderr": 0.019576953122088837 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910508, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910508 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.6979591836734694, "acc_stderr": 0.029393609319879804, "acc_norm": 0.6979591836734694, "acc_norm_stderr": 0.029393609319879804 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.7860696517412935, "acc_stderr": 0.02899690969332891, "acc_norm": 0.7860696517412935, "acc_norm_stderr": 0.02899690969332891 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.03015113445777634, "acc_norm": 0.9, "acc_norm_stderr": 0.03015113445777634 },
    "harness|hendrycksTest-virology|5": { "acc": 0.5060240963855421, "acc_stderr": 0.03892212195333045, "acc_norm": 0.5060240963855421, "acc_norm_stderr": 0.03892212195333045 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.7719298245614035, "acc_stderr": 0.03218093795602357, "acc_norm": 0.7719298245614035, "acc_norm_stderr": 0.03218093795602357 },
    "harness|truthfulqa:mc|0": { "mc1": 0.2937576499388005, "mc1_stderr": 0.015945068581236618, "mc2": 0.45375623938134657, "mc2_stderr": 0.015248519436290428 },
    "harness|winogrande|5": { "acc": 0.7466456195737964, "acc_stderr": 0.012223754434233626 },
    "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 }
}
```
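For programmatic access, the same numbers can be pulled from the linked JSON file directly. A minimal sketch using `huggingface_hub` (the exact top-level layout of the file is not shown here, so the code only inspects it before digging in):

```python
import json

from huggingface_hub import hf_hub_download

# Download the results file linked above; repo_type="dataset" is required
# because this is a dataset repository, not a model repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup",
    filename="results_2024-02-12T08-24-49.664707.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# The file may nest the per-task dict shown above under extra keys,
# so list the top-level keys first.
print(list(raw.keys()))
```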
## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
**id:** open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup

**tags:** `[ "region:us" ]`

**created_at:** 2024-02-12T08:27:10+00:00
{"pretty_name": "Evaluation run of jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup", "dataset_summary": "Dataset automatically created during the evaluation run of model [jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup](https://huggingface.co/jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T08:24:49.664707](https://huggingface.co/datasets/open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup/blob/main/results_2024-02-12T08-24-49.664707.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.604711622779494,\n \"acc_stderr\": 0.032467251718581315,\n \"acc_norm\": 0.6163715674554368,\n \"acc_norm_stderr\": 0.03334110735529664,\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.45375623938134657,\n \"mc2_stderr\": 0.015248519436290428\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520763,\n \"acc_norm\": 0.6006825938566553,\n \"acc_norm_stderr\": 0.014312094557946704\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6222863971320454,\n \"acc_stderr\": 0.004838246410786262,\n \"acc_norm\": 0.8218482374029078,\n \"acc_norm_stderr\": 0.0038185843846355303\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n 
\"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35978835978835977,\n \"acc_stderr\": 0.02471807594412928,\n \"acc_norm\": 0.35978835978835977,\n \"acc_norm_stderr\": 0.02471807594412928\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.034953345821629345,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.034953345821629345\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.0291265228345868,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.0291265228345868\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514565,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514565\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931048,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931048\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n 
\"acc_norm_stderr\": 0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n \"acc_stderr\": 0.015166544550490298,\n \"acc_norm\": 0.28938547486033517,\n \"acc_norm_stderr\": 0.015166544550490298\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.026716118380156847,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.026716118380156847\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.026457225067811032,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.026457225067811032\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630457,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897224,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897224\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6258169934640523,\n \"acc_stderr\": 0.019576953122088837,\n \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.019576953122088837\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910508,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.45375623938134657,\n \"mc2_stderr\": 0.015248519436290428\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7466456195737964,\n \"acc_stderr\": 0.012223754434233626\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-24-49.664707.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-24-49.664707.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-24-49.664707.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-24-49.664707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-24-49.664707.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-24-49.664707.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["**/details_harness|winogrande|5_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T08-24-49.664707.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T08_24_49.664707", "path": ["results_2024-02-12T08-24-49.664707.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T08-24-49.664707.parquet"]}]}]}
2024-02-12T08:27:30+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup Dataset automatically created during the evaluation run of model jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T08:24:49.664707 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
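The load snippet that the sentence "To load the details from a run..." introduces was stripped from this plain-text rendering of the card; a minimal sketch of that call, assuming the details repo follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's "details_<org>__<model>" convention;
# "harness_winogrande_5" is one of the 63 task configurations of this dataset.
data = load_dataset(
    "open-llm-leaderboard/details_jingyeom__KoSoLAR-10.7B-v0.2_1.4_dedup",
    "harness_winogrande_5",
    split="train",
)
```

Any of the other task configurations listed in this record's metadata (for example the `harness_hendrycksTest_*_5` configs) can be substituted for `harness_winogrande_5`.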
[ "# Dataset Card for Evaluation run of jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup\n\n\n\nDataset automatically created during the evaluation run of model jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T08:24:49.664707(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup\n\n\n\nDataset automatically created during the evaluation run of model jingyeom/KoSoLAR-10.7B-v0.2_1.4_dedup on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T08:24:49.664707(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7c3dee547f79175df5e165059c5418173eed0b15
# Dataset Card for Evaluation run of eren23/dpo-binarized-NeutrixOmnibe-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [eren23/dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_eren23__dpo-binarized-NeutrixOmnibe-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T08:32:59.622106](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__dpo-binarized-NeutrixOmnibe-7B/blob/main/results_2024-02-12T08-32-59.622106.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6524193795014644, "acc_stderr": 0.03203241158000217, "acc_norm": 0.6515007521390985, "acc_norm_stderr": 0.03270726703205656, "mc1": 0.6242350061199511, "mc1_stderr": 0.01695458406021429, "mc2": 0.7690068996757015, "mc2_stderr": 0.013938036873589639 }, "harness|arc:challenge|25": { "acc": 0.7167235494880546, "acc_stderr": 0.013167478735134575, "acc_norm": 0.7278156996587031, "acc_norm_stderr": 0.013006600406423702 }, "harness|hellaswag|10": { "acc": 0.7152957578171679, "acc_stderr": 0.0045035118550500325, "acc_norm": 0.8904600677155945, "acc_norm_stderr": 0.003116771577319422 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.028254200344438662, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.028254200344438662 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr":
0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055273, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055273 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473082, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886797, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886797 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, "acc_stderr": 0.03943966699183629, 
"acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500104, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500104 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43910614525139663, "acc_stderr": 0.016598022120580428, "acc_norm": 0.43910614525139663, "acc_norm_stderr": 0.016598022120580428 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.025646863097137897, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.025646863097137897 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4706649282920469, "acc_stderr": 0.012748238397365549, "acc_norm": 0.4706649282920469, "acc_norm_stderr": 0.012748238397365549 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6242350061199511, "mc1_stderr": 0.01695458406021429, "mc2": 0.7690068996757015, "mc2_stderr": 0.013938036873589639 }, "harness|winogrande|5": { "acc": 0.850828729281768, "acc_stderr": 0.010012598805627297 }, "harness|gsm8k|5": { "acc": 0.6944655041698257, "acc_stderr": 0.012688134076726884 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_eren23__dpo-binarized-NeutrixOmnibe-7B
[ "region:us" ]
2024-02-12T08:35:18+00:00
{"pretty_name": "Evaluation run of eren23/dpo-binarized-NeutrixOmnibe-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [eren23/dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__dpo-binarized-NeutrixOmnibe-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T08:32:59.622106](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__dpo-binarized-NeutrixOmnibe-7B/blob/main/results_2024-02-12T08-32-59.622106.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6524193795014644,\n \"acc_stderr\": 0.03203241158000217,\n \"acc_norm\": 0.6515007521390985,\n \"acc_norm_stderr\": 0.03270726703205656,\n \"mc1\": 0.6242350061199511,\n \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7690068996757015,\n \"mc2_stderr\": 0.013938036873589639\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7152957578171679,\n \"acc_stderr\": 0.0045035118550500325,\n \"acc_norm\": 0.8904600677155945,\n \"acc_norm_stderr\": 0.003116771577319422\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 
0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7690068996757015,\n \"mc2_stderr\": 0.013938036873589639\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627297\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6944655041698257,\n \"acc_stderr\": 0.012688134076726884\n }\n}\n```", "repo_url": 
"https://huggingface.co/eren23/dpo-binarized-NeutrixOmnibe-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-32-59.622106.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-32-59.622106.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-32-59.622106.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-32-59.622106.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-32-59.622106.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T08_32_59.622106", "path": ["**/details_harness|winogrande|5_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T08-32-59.622106.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_12T08_32_59.622106", "path": ["results_2024-02-12T08-32-59.622106.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T08-32-59.622106.parquet"]}]}]}
2024-02-12T08:35:39+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of eren23/dpo-binarized-NeutrixOmnibe-7B Dataset automatically created during the evaluation run of model eren23/dpo-binarized-NeutrixOmnibe-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T08:32:59.622106 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
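The plain-text rendering above ends its loading instructions at "do the following:" because the original code fence was stripped when the card was flattened. A minimal sketch of that loading call, assuming the details repository for this record follows the open-llm-leaderboard/details_<org>__<model> naming pattern used by the other cards in this dump (the exact repo id is not shown in this record's visible fields, so treat it as an assumption):

```python
from datasets import load_dataset

# Assumption: the details repo id follows the usual
# open-llm-leaderboard/details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_eren23__dpo-binarized-NeutrixOmnibe-7B",
    "harness_winogrande_5",  # one of the 63 per-task configurations
    split="train",           # "train" always points to the latest results
)
```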
[ "# Dataset Card for Evaluation run of eren23/dpo-binarized-NeutrixOmnibe-7B\n\n\n\nDataset automatically created during the evaluation run of model eren23/dpo-binarized-NeutrixOmnibe-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T08:32:59.622106(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of eren23/dpo-binarized-NeutrixOmnibe-7B\n\n\n\nDataset automatically created during the evaluation run of model eren23/dpo-binarized-NeutrixOmnibe-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T08:32:59.622106(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
54648b7a090428499433c85575329ec61be54617
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T08:49:35.524384](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k/blob/main/results_2024-02-12T08-49-35.524384.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7079066229639335, "acc_stderr": 0.03049945480767806, "acc_norm": 0.7110631168796062, "acc_norm_stderr": 0.031098742932734718, "mc1": 0.4039167686658507, "mc1_stderr": 0.017177276822584284, "mc2": 0.567198448780328, "mc2_stderr": 0.01504385546272261 }, "harness|arc:challenge|25": { "acc": 0.6416382252559727, "acc_stderr": 0.01401288333485986, "acc_norm": 0.6766211604095563, "acc_norm_stderr": 0.013669421630012136 }, "harness|hellaswag|10": { "acc": 0.6464847639912368, "acc_stderr": 0.0047708386783560375, "acc_norm": 0.8429595698068114, "acc_norm_stderr": 0.0036309529998437276 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.04094376269996793, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.04094376269996793 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7763157894736842, "acc_stderr": 0.03391160934343604, "acc_norm": 0.7763157894736842, "acc_norm_stderr": 0.03391160934343604 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7886792452830189, "acc_stderr": 0.025125766484827845, "acc_norm": 0.7886792452830189, "acc_norm_stderr": 0.025125766484827845 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8402777777777778, "acc_stderr": 0.030635578972093278, "acc_norm": 0.8402777777777778, "acc_norm_stderr": 0.030635578972093278 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.61, "acc_stderr": 0.049020713000019756, "acc_norm": 0.61,
"acc_norm_stderr": 0.049020713000019756 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.7167630057803468, "acc_stderr": 0.03435568056047875, "acc_norm": 0.7167630057803468, "acc_norm_stderr": 0.03435568056047875 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6468085106382979, "acc_stderr": 0.031245325202761926, "acc_norm": 0.6468085106382979, "acc_norm_stderr": 0.031245325202761926 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.631578947368421, "acc_stderr": 0.04537815354939391, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.04537815354939391 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6689655172413793, "acc_stderr": 0.03921545312467122, "acc_norm": 0.6689655172413793, "acc_norm_stderr": 0.03921545312467122 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.025733641991838987, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.025733641991838987 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04444444444444449, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04444444444444449 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8483870967741935, "acc_stderr": 0.02040261665441676, "acc_norm": 0.8483870967741935, "acc_norm_stderr": 0.02040261665441676 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.6206896551724138, "acc_stderr": 0.034139638059062345, "acc_norm": 0.6206896551724138, "acc_norm_stderr": 0.034139638059062345 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8181818181818182, "acc_stderr": 0.030117688929503582, "acc_norm": 0.8181818181818182, "acc_norm_stderr": 0.030117688929503582 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8585858585858586, "acc_stderr": 0.024825909793343343, "acc_norm": 0.8585858585858586, "acc_norm_stderr": 0.024825909793343343 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9378238341968912, "acc_stderr": 0.017426974154240528, "acc_norm": 0.9378238341968912, "acc_norm_stderr": 0.017426974154240528 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7051282051282052, "acc_stderr": 0.0231193627582323, "acc_norm": 0.7051282051282052, "acc_norm_stderr": 0.0231193627582323 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253252, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253252 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8151260504201681, "acc_stderr": 0.025215992877954202, "acc_norm": 0.8151260504201681, "acc_norm_stderr": 0.025215992877954202 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.5298013245033113, "acc_stderr": 
0.040752249922169775, "acc_norm": 0.5298013245033113, "acc_norm_stderr": 0.040752249922169775 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8752293577981651, "acc_stderr": 0.014168298359156326, "acc_norm": 0.8752293577981651, "acc_norm_stderr": 0.014168298359156326 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.6296296296296297, "acc_stderr": 0.03293377139415191, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.03293377139415191 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8627450980392157, "acc_stderr": 0.024152225962801588, "acc_norm": 0.8627450980392157, "acc_norm_stderr": 0.024152225962801588 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8860759493670886, "acc_stderr": 0.020681745135884562, "acc_norm": 0.8860759493670886, "acc_norm_stderr": 0.020681745135884562 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7713004484304933, "acc_stderr": 0.028188240046929203, "acc_norm": 0.7713004484304933, "acc_norm_stderr": 0.028188240046929203 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.816793893129771, "acc_stderr": 0.03392770926494732, "acc_norm": 0.816793893129771, "acc_norm_stderr": 0.03392770926494732 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8512396694214877, "acc_stderr": 0.032484700838071943, "acc_norm": 0.8512396694214877, "acc_norm_stderr": 0.032484700838071943 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.031921934489347235, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.031921934489347235 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.8446601941747572, "acc_stderr": 0.03586594738573974, "acc_norm": 0.8446601941747572, "acc_norm_stderr": 0.03586594738573974 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.017893784904018536, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.017893784904018536 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8825031928480205, "acc_stderr": 0.011515102251977207, "acc_norm": 0.8825031928480205, "acc_norm_stderr": 0.011515102251977207 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7890173410404624, "acc_stderr": 0.021966309947043117, "acc_norm": 0.7890173410404624, "acc_norm_stderr": 0.021966309947043117 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.5083798882681564, "acc_stderr": 0.016720152794672486, "acc_norm": 0.5083798882681564, "acc_norm_stderr": 0.016720152794672486 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.826797385620915, "acc_stderr": 0.021668400256514272, "acc_norm": 0.826797385620915, "acc_norm_stderr": 0.021668400256514272 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7877813504823151, "acc_stderr": 0.023222756797435098, "acc_norm": 0.7877813504823151, "acc_norm_stderr": 0.023222756797435098 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8024691358024691, "acc_stderr": 0.022152889927898965, "acc_norm": 0.8024691358024691, "acc_norm_stderr": 0.022152889927898965 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.5531914893617021, "acc_stderr": 0.02965823509766691, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.02965823509766691 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.529986962190352, "acc_stderr": 0.012747248967079053, "acc_norm": 0.529986962190352, "acc_norm_stderr": 0.012747248967079053 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7757352941176471, "acc_stderr": 0.025336848563332376, "acc_norm": 0.7757352941176471, "acc_norm_stderr": 0.025336848563332376 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.761437908496732, "acc_stderr": 0.017242385828779606, "acc_norm": 0.761437908496732, "acc_norm_stderr": 0.017242385828779606 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7877551020408163, "acc_stderr": 0.026176967197866764, "acc_norm": 0.7877551020408163, "acc_norm_stderr": 0.026176967197866764 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101716, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101716 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352201, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352201 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8538011695906432, "acc_stderr": 0.027097290118070813, "acc_norm": 0.8538011695906432, "acc_norm_stderr": 0.027097290118070813 }, "harness|truthfulqa:mc|0": { "mc1": 0.4039167686658507, "mc1_stderr": 0.017177276822584284, "mc2": 0.567198448780328, "mc2_stderr": 0.01504385546272261 }, "harness|winogrande|5": { "acc": 0.8097868981846882, "acc_stderr": 0.01103033579861744 }, "harness|gsm8k|5": { "acc": 0.6512509476876421, "acc_stderr": 0.013127227055035861 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
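The card's own snippet above pulls a single per-task configuration; the aggregated metrics instead live in the "results" configuration, and every configuration also exposes a "latest" split that resolves to the most recent run. A minimal sketch, assuming this record's config list ends with the same "results" configuration and "latest" split that appear in the previous record's metadata:

```python
from datasets import load_dataset

# Assumption: "results" holds the run-level aggregated metrics and
# "latest" always points at the parquet file of the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k",
    "results",
    split="latest",
)
print(results[0])  # single row with the aggregated metrics for the run
```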
open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k
[ "region:us" ]
2024-02-12T08:51:51+00:00
{"pretty_name": "Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T08:49:35.524384](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k/blob/main/results_2024-02-12T08-49-35.524384.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7079066229639335,\n \"acc_stderr\": 0.03049945480767806,\n \"acc_norm\": 0.7110631168796062,\n \"acc_norm_stderr\": 0.031098742932734718,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.567198448780328,\n \"mc2_stderr\": 0.01504385546272261\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6416382252559727,\n \"acc_stderr\": 0.01401288333485986,\n \"acc_norm\": 0.6766211604095563,\n \"acc_norm_stderr\": 0.013669421630012136\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6464847639912368,\n \"acc_stderr\": 0.0047708386783560375,\n \"acc_norm\": 0.8429595698068114,\n \"acc_norm_stderr\": 0.0036309529998437276\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093278,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093278\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.03435568056047875,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.03435568056047875\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.04537815354939391,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.04537815354939391\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8483870967741935,\n \"acc_stderr\": 0.02040261665441676,\n \"acc_norm\": 0.8483870967741935,\n \"acc_norm_stderr\": 0.02040261665441676\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503582,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503582\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8585858585858586,\n \"acc_stderr\": 0.024825909793343343,\n \"acc_norm\": 0.8585858585858586,\n \"acc_norm_stderr\": 0.024825909793343343\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.0231193627582323,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.0231193627582323\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8151260504201681,\n \"acc_stderr\": 0.025215992877954202,\n \"acc_norm\": 0.8151260504201681,\n \"acc_norm_stderr\": 0.025215992877954202\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5298013245033113,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.5298013245033113,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8752293577981651,\n \"acc_stderr\": 0.014168298359156326,\n \"acc_norm\": 0.8752293577981651,\n \"acc_norm_stderr\": 0.014168298359156326\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.03293377139415191,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.03293377139415191\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494732,\n \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494732\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8512396694214877,\n \"acc_stderr\": 0.032484700838071943,\n \"acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.032484700838071943\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n \"acc_stderr\": 0.017893784904018536,\n \"acc_norm\": 0.9188034188034188,\n \"acc_norm_stderr\": 0.017893784904018536\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8825031928480205,\n \"acc_stderr\": 
0.011515102251977207,\n \"acc_norm\": 0.8825031928480205,\n \"acc_norm_stderr\": 0.011515102251977207\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7890173410404624,\n \"acc_stderr\": 0.021966309947043117,\n \"acc_norm\": 0.7890173410404624,\n \"acc_norm_stderr\": 0.021966309947043117\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5083798882681564,\n \"acc_stderr\": 0.016720152794672486,\n \"acc_norm\": 0.5083798882681564,\n \"acc_norm_stderr\": 0.016720152794672486\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514272,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514272\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n \"acc_stderr\": 0.023222756797435098,\n \"acc_norm\": 0.7877813504823151,\n \"acc_norm_stderr\": 0.023222756797435098\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.022152889927898965,\n \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.022152889927898965\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.529986962190352,\n \"acc_stderr\": 0.012747248967079053,\n \"acc_norm\": 0.529986962190352,\n \"acc_norm_stderr\": 0.012747248967079053\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7757352941176471,\n \"acc_stderr\": 0.025336848563332376,\n \"acc_norm\": 0.7757352941176471,\n \"acc_norm_stderr\": 0.025336848563332376\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.017242385828779606,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.017242385828779606\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101716,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101716\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352201,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352201\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070813,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070813\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.567198448780328,\n \"mc2_stderr\": 0.01504385546272261\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6512509476876421,\n \"acc_stderr\": 0.013127227055035861\n }\n}\n```", "repo_url": 
"https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-49-35.524384.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-49-35.524384.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-49-35.524384.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T08-49-35.524384.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-49-35.524384.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T08_49_35.524384", "path": ["**/details_harness|winogrande|5_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T08-49-35.524384.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_12T08_49_35.524384", "path": ["results_2024-02-12T08-49-35.524384.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T08-49-35.524384.parquet"]}]}]}
2024-02-12T08:52:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T08:49:35.524384 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
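A minimal sketch of such a load call, following the convention used by the other evaluation cards in this collection. The repository id is inferred from the leaderboard's `details_<org>__<model>` naming pattern rather than stated in this record, so treat it as an assumption; `harness_winogrande_5` is one of the config names listed in this record's metadata.

```python
from datasets import load_dataset

# Repo id inferred from the Open LLM Leaderboard naming convention (assumption).
data = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v18.1-32k",
    "harness_winogrande_5",  # config name taken from this record's metadata
    split="train",           # the "train" split always points at the latest results
)
```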
[ "# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T08:49:35.524384(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-mixtral-7bx8-v18.1-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T08:49:35.524384(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4e61b0bd1e851f83bfa8e1e070d3c669c68d6ded
# Dataset Card for "promptflow" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
treezy254/promptflow
[ "region:us" ]
2024-02-12T09:08:09+00:00
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "metadata", "struct": [{"name": "file_path", "dtype": "string"}, {"name": "repo_id", "dtype": "string"}, {"name": "token_count", "dtype": "int64"}]}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 7390235, "num_examples": 417}], "download_size": 1930559, "dataset_size": 7390235}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T09:08:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for "promptflow" More Information needed
[ "# Dataset Card for \"promptflow\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"promptflow\"\n\nMore Information needed" ]
d36596a0ccfee22dbcceaf4f347b8b1731a260b7
# Indonesian Corpus ## Description This dataset contains a corpus in the Indonesian language taken from [Korpus Indonesia](https://korpusindonesia.kemdikbud.go.id/), provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus is a collection of text in sentence format covering various fields of study, such as Social, Health, Literature, Opinion, Sports, Culture, and others. ## Contents The dataset consists of texts in the Indonesian language grouped based on specific fields of study or topics. Each text is a collection of sentences sourced from the aforementioned provider. ## Usage This dataset can be used for various research and development purposes in the field of natural language processing (NLP), text analysis, text classification, and other research that requires text data in the Indonesian language. ## License The dataset is retrieved from [Korpus Indonesia](https://korpusindonesia.kemdikbud.go.id/) provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source. ## References For more information about Korpus Indonesia, please visit [https://korpusindonesia.kemdikbud.go.id/](https://korpusindonesia.kemdikbud.go.id/).
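A minimal loading sketch for the Usage section above, assuming only the repository id recorded below; the card does not document split or column names, so the sketch inspects them rather than assuming them.

```python
from datasets import load_dataset

# Load the corpus from the Hugging Face Hub (repository id from this record).
dataset = load_dataset("DamarJati/indocorpus-mix")

# The card does not document the schema, so inspect splits and columns first.
print(dataset)
for split_name in dataset:
    print(split_name, dataset[split_name].column_names)
```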
DamarJati/indocorpus-mix
[ "task_categories:text2text-generation", "size_categories:10K<n<100K", "language:id", "corpus", "indonesia", "text", "parquet", "region:us" ]
2024-02-12T09:30:22+00:00
{"language": ["id"], "size_categories": ["10K<n<100K"], "task_categories": ["text2text-generation"], "pretty_name": "Corpus Indonesia", "tags": ["corpus", "indonesia", "text", "parquet"]}
2024-02-12T22:52:05+00:00
[]
[ "id" ]
TAGS #task_categories-text2text-generation #size_categories-10K<n<100K #language-Indonesian #corpus #indonesia #text #parquet #region-us
# Indonesian Corpus ## Description This dataset contains a corpus in the Indonesian language taken from Korpus Indonesia, provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus is a collection of text in sentence format covering various fields of study, such as Social, Health, Literature, Opinion, Sports, Culture, and others. ## Contents The dataset consists of texts in the Indonesian language grouped based on specific fields of study or topics. Each text is a collection of sentences sourced from the aforementioned provider. ## Usage This dataset can be used for various research and development purposes in the field of natural language processing (NLP), text analysis, text classification, and other research that requires text data in the Indonesian language. ## License The dataset is retrieved from Korpus Indonesia provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source. ## References For more information about Korpus Indonesia, please visit URL
[ "# Indonesian Corpus", "## Description\nThis dataset contains a corpus in the Indonesian language taken from Korpus Indonesia, provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus is a collection of text in sentence format covering various fields of study, such as Social, Health, Literature, Opinion, Sports, Culture, and others.", "## Contents\nThe dataset consists of texts in the Indonesian language grouped based on specific fields of study or topics. Each text is a collection of sentences sourced from the aforementioned provider.", "## Usage\nThis dataset can be used for various research and development purposes in the field of natural language processing (NLP), text analysis, text classification, and other research that requires text data in the Indonesian language.", "## License\nThe dataset is retrieved from Korpus Indonesia provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source.", "## References\nFor more information about Korpus Indonesia, please visit URL" ]
[ "TAGS\n#task_categories-text2text-generation #size_categories-10K<n<100K #language-Indonesian #corpus #indonesia #text #parquet #region-us \n", "# Indonesian Corpus", "## Description\nThis dataset contains a corpus in the Indonesian language taken from Korpus Indonesia, provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus is a collection of text in sentence format covering various fields of study, such as Social, Health, Literature, Opinion, Sports, Culture, and others.", "## Contents\nThe dataset consists of texts in the Indonesian language grouped based on specific fields of study or topics. Each text is a collection of sentences sourced from the aforementioned provider.", "## Usage\nThis dataset can be used for various research and development purposes in the field of natural language processing (NLP), text analysis, text classification, and other research that requires text data in the Indonesian language.", "## License\nThe dataset is retrieved from Korpus Indonesia provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source.", "## References\nFor more information about Korpus Indonesia, please visit URL" ]
80823498abe4b4e2fc7139b026c07bca60c7ab3b
# Dataset Card for "domsdatabasen" ## Dataset Description - **Point of Contact:** [Oliver Kinch](mailto:[email protected]) - **Size of dataset:** 199 MB ### Dataset Summary [Domsdatabasen](https://domsdatabasen.dk/) is a database where you can find and read selected judgments delivered by the Danish Courts. Each judgment/case consists of tabular data and a case-descriptive PDF. This dataset collects all these cases, with each sample describing a specific judgment/case. The PDFs are anonymized to protect sensitive information. Therefore, each sample includes two text versions: - `text_anon` (with anonymization tags: \<anonym\>"Some sensitive text"\</anonym\>). - `text` (without anonymization tags). `text_anon` is read with [Easyocr](https://github.com/JaidedAI/EasyOCR). `text` is read with [Easyocr](https://github.com/JaidedAI/EasyOCR) or [Tika-python](https://github.com/chrismattmann/tika-python) depending on the PDF and the anonymization method used. `text_anon` will be empty if no anonymization is detected in the PDF. ### Languages The dataset is available in Danish (`da`). ## Dataset Structure An example from the dataset looks as follows. ``` { "case_id": "id of case/judgment", "tabular_data": { "Overskift": "some title", "Sagstype": "type of case", ... } "text": "pdf text", "text_anon": "anonymized pdf text" "text_len": <number of chars in text>, "text_anon_len": <number of chars in anonymized text> } ``` ### Data Fields - `case_id`: a `string` feature. - `tabular_data`: a `dict` feature. - `text`: a `string` feature. - `text_anon`: a `string` feature. - `text_len`: an `int` feature. - `text_anon_len`: an `int` feature. ### Dataset Statistics #### Size of dataset With the PDF texts being provided in two versions, `text` and `text_anon`, the total size of all PDF texts is approximately ~199//2 MB. #### Number of samples - 3919 #### PDF Text Length Distribution Statistics based on `text`. - Minimum length: 192 - Maximum length: 2101736 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/61e0713ac50610f535ed2c88/YTBH-nSHd2b4z6LIjeMF-.png) ## Potential Dataset Issues See [open issues](https://github.com/oliverkinch/doms_databasen/issues). ## Dataset Creation ### Curation Rationale There are not many large-scale law datasets in Danish. ### Source Data The dataset has been scraped from [Domsdatabasen](https://domsdatabasen.dk/). ## Additional Information ### Dataset Curators [Oliver Kinch](https://huggingface.co/oliverkinch) from the [The Alexandra Institute](https://alexandra.dk/) ### Licensing Information The dataset is licensed under the [CC0 license](https://creativecommons.org/share-your-work/public-domain/cc0/).
oliverkinch/domsdatabasen
[ "region:us" ]
2024-02-12T09:38:24+00:00
{"dataset_info": {"features": [{"name": "case_id", "dtype": "string"}, {"name": "tabular_data", "struct": [{"name": "Overskrift", "dtype": "string"}, {"name": "Afg\u00f8relsesstatus", "dtype": "string"}, {"name": "Faggruppe", "dtype": "string"}, {"name": "Ret", "dtype": "string"}, {"name": "Rettens sagsnummer", "dtype": "string"}, {"name": "Sagstype", "dtype": "string"}, {"name": "Instans", "dtype": "string"}, {"name": "Domsdatabasens sagsnummer", "dtype": "string"}, {"name": "Sagsemner", "dtype": "string"}, {"name": "S\u00e6rlige retsskridt", "dtype": "string"}, {"name": "Sagsdeltagere", "dtype": "string"}, {"name": "D\u00f8rlukning", "dtype": "string"}, {"name": "L\u00f8ftet ud af sm\u00e5sagsprocessen", "dtype": "string"}, {"name": "Anerkendelsesp\u00e5stand", "dtype": "string"}, {"name": "Politiets journalnummer", "dtype": "string"}, {"name": "P\u00e5standsbel\u00f8b", "dtype": "string"}, {"name": "Sagskomplekser", "dtype": "string"}]}, {"name": "text", "dtype": "string"}, {"name": "text_anonymized", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 193530504, "num_examples": 3917}], "download_size": 96393615, "dataset_size": 193530504}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-13T08:45:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for "domsdatabasen" ## Dataset Description - Point of Contact: Oliver Kinch - Size of dataset: 199 MB ### Dataset Summary Domsdatabasen is a database where you can find and read selected judgments delivered by the Danish Courts. Each judgment/case consists of tabular data and a case-descriptive PDF. This dataset collects all these cases, with each sample describing a specific judgment/case. The PDFs are anonymized to protect sensitive information. Therefore, each sample includes two text versions: - 'text_anon' (with anonymization tags: \<anonym\>"Some sensitive text"\</anonym\>). - 'text' (without anonymization tags). 'text_anon' is read with Easyocr. 'text' is read with Easyocr or Tika-python depending on the PDF and the anonymization method used. 'text_anon' will be empty if no anonymization is detected in the PDF. ### Languages The dataset is available in Danish ('da'). ## Dataset Structure An example from the dataset looks as follows. ### Data Fields - 'case_id': a 'string' feature. - 'tabular_data': a 'dict' feature. - 'text': a 'string' feature. - 'text_anon': a 'string' feature. - 'text_len': an 'int' feature. - 'text_anon_len': an 'int' feature. ### Dataset Statistics #### Size of dataset With the PDF texts being provided in two versions, 'text' and 'text_anon', the total size of all PDF texts is approximately ~199//2 MB. #### Number of samples - 3919 #### PDF Text Length Distribution Statistics based on 'text'. - Minimum length: 192 - Maximum length: 2101736 !image/png ## Potential Dataset Issues See open issues. ## Dataset Creation ### Curation Rationale There are not many large-scale law datasets in Danish. ### Source Data The dataset has been scraped from Domsdatabasen. ## Additional Information ### Dataset Curators Oliver Kinch from the The Alexandra Institute ### Licensing Information The dataset is licensed under the CC0 license.
[ "# Dataset Card for \"domsdatabasen\"", "## Dataset Description\n\n- Point of Contact: Oliver Kinch\n- Size of dataset: 199 MB", "### Dataset Summary\n\nDomsdatabasen is a database where you can find and read selected judgments delivered by the Danish Courts. \n\nEach judgment/case consists of tabular data and a case-descriptive PDF. This dataset collects all these cases, with each sample describing a specific judgment/case.\n\nThe PDFs are anonymized to protect sensitive information. Therefore, each sample includes two text versions: \n- 'text_anon' (with anonymization tags: \\<anonym\\>\"Some sensitive text\"\\</anonym\\>).\n- 'text' (without anonymization tags).\n\n'text_anon' is read with Easyocr.\n\n'text' is read with Easyocr or Tika-python \ndepending on the PDF and the anonymization method used.\n\n'text_anon' will be empty if no anonymization is detected in the PDF.", "### Languages\n\nThe dataset is available in Danish ('da').", "## Dataset Structure\n\nAn example from the dataset looks as follows.", "### Data Fields\n\n- 'case_id': a 'string' feature.\n- 'tabular_data': a 'dict' feature.\n- 'text': a 'string' feature.\n- 'text_anon': a 'string' feature.\n- 'text_len': an 'int' feature.\n- 'text_anon_len': an 'int' feature.", "### Dataset Statistics", "#### Size of dataset\n\nWith the PDF texts being provided in two versions, 'text' and 'text_anon', the total size of all PDF texts is approximately ~199//2 MB.", "#### Number of samples\n\n- 3919", "#### PDF Text Length Distribution\n\nStatistics based on 'text'.\n\n- Minimum length: 192\n- Maximum length: 2101736\n\n!image/png", "## Potential Dataset Issues\n\nSee open issues.", "## Dataset Creation", "### Curation Rationale\n\nThere are not many large-scale law datasets in Danish.", "### Source Data\n\nThe dataset has been scraped from Domsdatabasen.", "## Additional Information", "### Dataset Curators\n\nOliver Kinch from the The Alexandra\nInstitute", "### Licensing Information\n\nThe dataset is licensed under the CC0\nlicense." ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"domsdatabasen\"", "## Dataset Description\n\n- Point of Contact: Oliver Kinch\n- Size of dataset: 199 MB", "### Dataset Summary\n\nDomsdatabasen is a database where you can find and read selected judgments delivered by the Danish Courts. \n\nEach judgment/case consists of tabular data and a case-descriptive PDF. This dataset collects all these cases, with each sample describing a specific judgment/case.\n\nThe PDFs are anonymized to protect sensitive information. Therefore, each sample includes two text versions: \n- 'text_anon' (with anonymization tags: \\<anonym\\>\"Some sensitive text\"\\</anonym\\>).\n- 'text' (without anonymization tags).\n\n'text_anon' is read with Easyocr.\n\n'text' is read with Easyocr or Tika-python \ndepending on the PDF and the anonymization method used.\n\n'text_anon' will be empty if no anonymization is detected in the PDF.", "### Languages\n\nThe dataset is available in Danish ('da').", "## Dataset Structure\n\nAn example from the dataset looks as follows.", "### Data Fields\n\n- 'case_id': a 'string' feature.\n- 'tabular_data': a 'dict' feature.\n- 'text': a 'string' feature.\n- 'text_anon': a 'string' feature.\n- 'text_len': an 'int' feature.\n- 'text_anon_len': an 'int' feature.", "### Dataset Statistics", "#### Size of dataset\n\nWith the PDF texts being provided in two versions, 'text' and 'text_anon', the total size of all PDF texts is approximately ~199//2 MB.", "#### Number of samples\n\n- 3919", "#### PDF Text Length Distribution\n\nStatistics based on 'text'.\n\n- Minimum length: 192\n- Maximum length: 2101736\n\n!image/png", "## Potential Dataset Issues\n\nSee open issues.", "## Dataset Creation", "### Curation Rationale\n\nThere are not many large-scale law datasets in Danish.", "### Source Data\n\nThe dataset has been scraped from Domsdatabasen.", "## Additional Information", "### Dataset Curators\n\nOliver Kinch from the The Alexandra\nInstitute", "### Licensing Information\n\nThe dataset is licensed under the CC0\nlicense." ]
6a07a9dce31ac3cb4d0ebe8e092475fd19009a28
# Dataset Card for "wsd_myriade_synth_data_v1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
gguichard/wsd_myriade_synth_data_v1
[ "region:us" ]
2024-02-12T09:42:24+00:00
{"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 32143597, "num_examples": 54750}], "download_size": 6205293, "dataset_size": 32143597}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T09:53:23+00:00
[]
[]
TAGS #region-us
# Dataset Card for "wsd_myriade_synth_data_v1" More Information needed
[ "# Dataset Card for \"wsd_myriade_synth_data_v1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"wsd_myriade_synth_data_v1\"\n\nMore Information needed" ]
837cec84dd70cd287557f81a2578bf9dd4ebfafe
# Dataset Card for "wsd_myriade_synth_data_v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
gguichard/wsd_myriade_synth_data_v2
[ "region:us" ]
2024-02-12T09:53:50+00:00
{"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 32143597, "num_examples": 54750}], "download_size": 6205293, "dataset_size": 32143597}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T09:53:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "wsd_myriade_synth_data_v2" More Information needed
[ "# Dataset Card for \"wsd_myriade_synth_data_v2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"wsd_myriade_synth_data_v2\"\n\nMore Information needed" ]
c5c0719c41dd430a1704edfdb8b3b492e869634b
# ELI5 question-answer pairs in Danish ## About This dataset is a version of the [ELI5 question-answer pairs dataset](https://huggingface.co/datasets/sentence-transformers/embedding-training-data) machine-translated from English to Danish ([link to original dataset](https://huggingface.co/datasets/eli5)). Machine translation is performed using the Helsinki NLP [English-to-Danish OPUS-MT model](https://huggingface.co/Helsinki-NLP/opus-mt-en-da). The dataset contains ~209k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage'). ## Usage Using the HuggingFace datasets library: ```python from datasets import load_dataset dataset = load_dataset("KennethTM/eli5_question_answer_danish") ```
KennethTM/eli5_question_answer_danish
[ "task_categories:feature-extraction", "task_categories:question-answering", "language:da", "license:unknown", "region:us" ]
2024-02-12T09:55:43+00:00
{"language": ["da"], "license": "unknown", "task_categories": ["feature-extraction", "question-answering"], "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "passage", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 109698512, "num_examples": 209408}], "download_size": 70746762, "dataset_size": 109698512}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T10:23:31+00:00
[]
[ "da" ]
TAGS #task_categories-feature-extraction #task_categories-question-answering #language-Danish #license-unknown #region-us
# ELI5 question-answer pairs in Danish ## About This dataset is a version of the ELI5 question-answer pairs dataset machine-translated from English to Danish (link to original dataset). Machine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model. The dataset contains ~209k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage'). ## Usage Using the HuggingFace datasets library:
[ "# ELI5 question-answer pairs in Danish", "## About\n\nThis dataset is a version of the ELI5 question-answer pairs dataset machine-translated from English to Danish (link to original dataset).\n\nMachine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.\n\nThe dataset contains ~209k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').", "## Usage\n\nUsing the HuggingFace datasets library:" ]
[ "TAGS\n#task_categories-feature-extraction #task_categories-question-answering #language-Danish #license-unknown #region-us \n", "# ELI5 question-answer pairs in Danish", "## About\n\nThis dataset is a version of the ELI5 question-answer pairs dataset machine-translated from English to Danish (link to original dataset).\n\nMachine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.\n\nThe dataset contains ~209k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').", "## Usage\n\nUsing the HuggingFace datasets library:" ]
6b63eafda486823a76e73746a739a6c819fec399
# Dataset Card for Evaluation run of TeeZee/DarkForest-20B-v1.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [TeeZee/DarkForest-20B-v1.2](https://huggingface.co/TeeZee/DarkForest-20B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TeeZee__DarkForest-20B-v1.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T09:58:29.626269](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkForest-20B-v1.2/blob/main/results_2024-02-12T09-58-29.626269.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.59652528447685, "acc_stderr": 0.03291464576716697, "acc_norm": 0.6028408155533549, "acc_norm_stderr": 0.03360637322940083, "mc1": 0.39657282741738065, "mc1_stderr": 0.017124930942023518, "mc2": 0.563147422518037, "mc2_stderr": 0.015977038696027138 }, "harness|arc:challenge|25": { "acc": 0.6117747440273038, "acc_stderr": 0.014241614207414044, "acc_norm": 0.6356655290102389, "acc_norm_stderr": 0.014063260279882419 }, "harness|hellaswag|10": { "acc": 0.6816371240788688, "acc_stderr": 0.0046488907875817, "acc_norm": 0.8641704839673372, "acc_norm_stderr": 0.0034190724807353617 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.046482319871173156, "acc_norm": 0.31, "acc_norm_stderr": 0.046482319871173156 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5111111111111111, "acc_stderr": 0.04318275491977976, "acc_norm": 0.5111111111111111, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6447368421052632, "acc_stderr": 0.038947344870133176, "acc_norm": 0.6447368421052632, "acc_norm_stderr": 0.038947344870133176 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6490566037735849, "acc_stderr": 0.02937364625323469, "acc_norm": 0.6490566037735849, "acc_norm_stderr": 0.02937364625323469 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6736111111111112, "acc_stderr": 0.03921067198982266, "acc_norm": 0.6736111111111112, "acc_norm_stderr": 0.03921067198982266 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.45, "acc_stderr": 0.049999999999999996, "acc_norm": 0.45, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36,
"acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5260115606936416, "acc_stderr": 0.03807301726504513, "acc_norm": 0.5260115606936416, "acc_norm_stderr": 0.03807301726504513 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.047240073523838876, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.047240073523838876 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.032436186361081004, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.032436186361081004 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2719298245614035, "acc_stderr": 0.04185774424022056, "acc_norm": 0.2719298245614035, "acc_norm_stderr": 0.04185774424022056 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878151, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878151 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35978835978835977, "acc_stderr": 0.024718075944129277, "acc_norm": 0.35978835978835977, "acc_norm_stderr": 0.024718075944129277 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6967741935483871, "acc_stderr": 0.026148685930671746, "acc_norm": 0.6967741935483871, "acc_norm_stderr": 0.026148685930671746 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4827586206896552, "acc_stderr": 0.035158955511656986, "acc_norm": 0.4827586206896552, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7525252525252525, "acc_stderr": 0.030746300742124498, "acc_norm": 0.7525252525252525, "acc_norm_stderr": 0.030746300742124498 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.02338193534812143, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.02338193534812143 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6025641025641025, "acc_stderr": 0.024811920017903836, "acc_norm": 0.6025641025641025, "acc_norm_stderr": 0.024811920017903836 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.02822644674968352, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.02822644674968352 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6596638655462185, "acc_stderr": 0.030778057422931673, "acc_norm": 0.6596638655462185, "acc_norm_stderr": 0.030778057422931673 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 
0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7834862385321101, "acc_stderr": 0.017658710594443128, "acc_norm": 0.7834862385321101, "acc_norm_stderr": 0.017658710594443128 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4675925925925926, "acc_stderr": 0.03402801581358966, "acc_norm": 0.4675925925925926, "acc_norm_stderr": 0.03402801581358966 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.028125972265654373, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.028125972265654373 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7848101265822784, "acc_stderr": 0.02675082699467618, "acc_norm": 0.7848101265822784, "acc_norm_stderr": 0.02675082699467618 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.04010358942462203, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.04010358942462203 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.03941897526516303, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.03941897526516303 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.039578354719809784, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.039578354719809784 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3482142857142857, "acc_stderr": 0.045218299028335865, "acc_norm": 0.3482142857142857, "acc_norm_stderr": 0.045218299028335865 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597528, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597528 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.789272030651341, "acc_stderr": 0.014583812465862545, "acc_norm": 0.789272030651341, "acc_norm_stderr": 0.014583812465862545 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6763005780346821, "acc_stderr": 0.025190181327608408, "acc_norm": 0.6763005780346821, "acc_norm_stderr": 0.025190181327608408 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2916201117318436, "acc_stderr": 0.015201032512520429, "acc_norm": 0.2916201117318436, "acc_norm_stderr": 0.015201032512520429 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6437908496732027, "acc_stderr": 0.02742047766262924, "acc_norm": 0.6437908496732027, "acc_norm_stderr": 0.02742047766262924 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6913183279742765, "acc_stderr": 0.026236965881153266, "acc_norm": 0.6913183279742765, "acc_norm_stderr": 0.026236965881153266 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7006172839506173, "acc_stderr": 0.025483115601195455, "acc_norm": 0.7006172839506173, "acc_norm_stderr": 0.025483115601195455 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 
0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4589308996088657, "acc_stderr": 0.012727084826799795, "acc_norm": 0.4589308996088657, "acc_norm_stderr": 0.012727084826799795 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5882352941176471, "acc_stderr": 0.029896163033125474, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.029896163033125474 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6372549019607843, "acc_stderr": 0.019450768432505518, "acc_norm": 0.6372549019607843, "acc_norm_stderr": 0.019450768432505518 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670238, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.04673752333670238 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7020408163265306, "acc_stderr": 0.029279567411065677, "acc_norm": 0.7020408163265306, "acc_norm_stderr": 0.029279567411065677 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7810945273631841, "acc_stderr": 0.029239174636647, "acc_norm": 0.7810945273631841, "acc_norm_stderr": 0.029239174636647 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.4879518072289157, "acc_stderr": 0.03891364495835821, "acc_norm": 0.4879518072289157, "acc_norm_stderr": 0.03891364495835821 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7777777777777778, "acc_stderr": 0.031885780176863984, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.031885780176863984 }, "harness|truthfulqa:mc|0": { "mc1": 0.39657282741738065, "mc1_stderr": 0.017124930942023518, "mc2": 0.563147422518037, "mc2_stderr": 0.015977038696027138 }, "harness|winogrande|5": { "acc": 0.7774269928966061, "acc_stderr": 0.011690933809712664 }, "harness|gsm8k|5": { "acc": 0.2494313874147081, "acc_stderr": 0.011918265218445518 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
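To drill down from one of these aggregate numbers into the per-sample details behind it, load the corresponding task configuration. A small sketch follows; the timestamped split name is taken from this repository's configuration metadata, and for a single-run dataset like this one it resolves to the same files as `"latest"`:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_TeeZee__DarkForest-20B-v1.2"

# "latest" and the explicit timestamped split of this run point at the same
# parquet files, since only one evaluation run has been recorded so far.
gsm8k_latest = load_dataset(REPO, "harness_gsm8k_5", split="latest")
gsm8k_run = load_dataset(REPO, "harness_gsm8k_5",
                         split="2024_02_12T09_58_29.626269")

print(len(gsm8k_latest), len(gsm8k_run))  # per-sample details for the GSM8K eval
```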
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
{"pretty_name": "Evaluation run of TeeZee/DarkForest-20B-v1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TeeZee/DarkForest-20B-v1.2](https://huggingface.co/TeeZee/DarkForest-20B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__DarkForest-20B-v1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T09:58:29.626269](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkForest-20B-v1.2/blob/main/results_2024-02-12T09-58-29.626269.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.59652528447685,\n \"acc_stderr\": 0.03291464576716697,\n \"acc_norm\": 0.6028408155533549,\n \"acc_norm_stderr\": 0.03360637322940083,\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.563147422518037,\n \"mc2_stderr\": 0.015977038696027138\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414044,\n \"acc_norm\": 0.6356655290102389,\n \"acc_norm_stderr\": 0.014063260279882419\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6816371240788688,\n \"acc_stderr\": 0.0046488907875817,\n \"acc_norm\": 0.8641704839673372,\n \"acc_norm_stderr\": 0.0034190724807353617\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.032436186361081004,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.032436186361081004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35978835978835977,\n \"acc_stderr\": 0.024718075944129277,\n \"acc_norm\": 0.35978835978835977,\n \"acc_norm_stderr\": 0.024718075944129277\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n \"acc_stderr\": 0.026148685930671746,\n \"acc_norm\": 0.6967741935483871,\n \"acc_norm_stderr\": 0.026148685930671746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124498,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124498\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n \"acc_stderr\": 
0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809784,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809784\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.045218299028335865,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.045218299028335865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.789272030651341,\n \"acc_stderr\": 0.014583812465862545,\n \"acc_norm\": 0.789272030651341,\n \"acc_norm_stderr\": 
0.014583812465862545\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608408,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608408\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n \"acc_stderr\": 0.015201032512520429,\n \"acc_norm\": 0.2916201117318436,\n \"acc_norm_stderr\": 0.015201032512520429\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.02742047766262924,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.02742047766262924\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n \"acc_stderr\": 0.012727084826799795,\n \"acc_norm\": 0.4589308996088657,\n \"acc_norm_stderr\": 0.012727084826799795\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.019450768432505518,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.019450768432505518\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065677,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.563147422518037,\n \"mc2_stderr\": 0.015977038696027138\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712664\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2494313874147081,\n \"acc_stderr\": 0.011918265218445518\n }\n}\n```", "repo_url": "https://huggingface.co/TeeZee/DarkForest-20B-v1.2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|arc:challenge|25_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|gsm8k|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hellaswag|10_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T09-58-29.626269.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T09-58-29.626269.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T09-58-29.626269.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T09-58-29.626269.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T09-58-29.626269.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T09-58-29.626269.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["**/details_harness|winogrande|5_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T09-58-29.626269.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T09_58_29.626269", "path": ["results_2024-02-12T09-58-29.626269.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T09-58-29.626269.parquet"]}]}]}
2024-02-12T10:01:09+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of TeeZee/DarkForest-20B-v1.2 Dataset automatically created during the evaluation run of model TeeZee/DarkForest-20B-v1.2 on the Open LLM Leaderboard. The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T09:58:29.626269(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of TeeZee/DarkForest-20B-v1.2\n\n\n\nDataset automatically created during the evaluation run of model TeeZee/DarkForest-20B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T09:58:29.626269(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of TeeZee/DarkForest-20B-v1.2\n\n\n\nDataset automatically created during the evaluation run of model TeeZee/DarkForest-20B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T09:58:29.626269(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
27741bb97d099d9ea4c8ff3c61c66db9be1e2b02
# SQuAD question-answer pairs in Danish ## About This dataset is a version of the [SQuAD question-answer pairs dataset](https://huggingface.co/datasets/sentence-transformers/embedding-training-data) machine-translated from English to Danish ([link to original dataset](https://huggingface.co/datasets/squad)). Machine translation is performed using the Helsinki NLP [English-to-Danish OPUS-MT model](https://huggingface.co/Helsinki-NLP/opus-mt-en-da). The dataset contains ~87k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage'). ## Usage Using the HuggingFace datasets library: ```python from datasets import load_dataset dataset = load_dataset("KennethTM/squad_pairs_danish") ```
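For context, here is a minimal sketch of what the translation step could look like with that model via the `transformers` translation pipeline. The batching, length handling, and post-processing actually used to build this dataset are not documented here, so everything beyond the model name is an assumption:

```python
from transformers import pipeline

# English-to-Danish OPUS-MT model named above
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-da")

# Illustrative English pair (hypothetical example, not taken from the dataset)
pair = {
    "query": "What is the capital of Denmark?",
    "passage": "Copenhagen is the capital and most populous city of Denmark.",
}

# Translate both fields; max_length=512 is an assumed safeguard for long passages
da_query = translator(pair["query"], max_length=512)[0]["translation_text"]
da_passage = translator(pair["passage"], max_length=512)[0]["translation_text"]
print({"query": da_query, "passage": da_passage})
```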
KennethTM/squad_pairs_danish
[ "task_categories:feature-extraction", "task_categories:question-answering", "language:da", "license:cc-by-sa-4.0", "region:us" ]
2024-02-12T10:19:28+00:00
{"language": ["da"], "license": "cc-by-sa-4.0", "task_categories": ["feature-extraction", "question-answering"], "dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "passage", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 69338889, "num_examples": 87599}], "download_size": 11644151, "dataset_size": 69338889}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T10:23:07+00:00
[]
[ "da" ]
TAGS #task_categories-feature-extraction #task_categories-question-answering #language-Danish #license-cc-by-sa-4.0 #region-us
# SQuAD question-answer pairs in Danish ## About This dataset is a version of the SQuAD question-answer pairs dataset machine-translated from English to Danish (link to original dataset). Machine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model. The dataset contains ~87k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage'). ## Usage Using the HuggingFace datasets library:
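For instance, loading the train split (the same snippet given in the full card above):

```python
from datasets import load_dataset

dataset = load_dataset("KennethTM/squad_pairs_danish")
```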
[ "# SQuAD question-answer pairs in Danish", "## About\n\nThis dataset is a version of the SQuAD question-answer pairs dataset machine-translated from English to Danish (link to original dataset).\n\nMachine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.\n\nThe dataset contains ~87k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').", "## Usage\n\nUsing the HuggingFace datasets library:" ]
[ "TAGS\n#task_categories-feature-extraction #task_categories-question-answering #language-Danish #license-cc-by-sa-4.0 #region-us \n", "# SQuAD question-answer pairs in Danish", "## About\n\nThis dataset is a version of the SQuAD question-answer pairs dataset machine-translated from English to Danish (link to original dataset).\n\nMachine translation is performed using the Helsinki NLP English-to-Danish OPUS-MT model.\n\nThe dataset contains ~87k question-answer pairs and can be used to train embedding and question-answer models. Each pair consists of one question ('query') and one passage containing the answer ('passage').", "## Usage\n\nUsing the HuggingFace datasets library:" ]
b1d5940eac8252584383a37bb5fe07a5230f90f2
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-slerp](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T11:11:52.976201](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp/blob/main/results_2024-02-12T11-11-52.976201.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6128348134449864, "acc_stderr": 0.03306039267014507, "acc_norm": 0.6174798445939971, "acc_norm_stderr": 0.033726644979784004, "mc1": 0.4773561811505508, "mc1_stderr": 0.01748554225848965, "mc2": 0.6348683354452056, "mc2_stderr": 0.015251462930296836 }, "harness|arc:challenge|25": { "acc": 0.5767918088737202, "acc_stderr": 0.014438036220848029, "acc_norm": 0.6203071672354948, "acc_norm_stderr": 0.01418211986697487 }, "harness|hellaswag|10": { "acc": 0.6528579964150567, "acc_stderr": 0.004750884401095161, "acc_norm": 0.8434574785899224, "acc_norm_stderr": 0.0036262628054422163 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621503, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621503 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5703703703703704, "acc_stderr": 0.042763494943765995, "acc_norm": 0.5703703703703704, "acc_norm_stderr": 0.042763494943765995 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.631578947368421, "acc_stderr": 0.03925523381052932, "acc_norm": 0.631578947368421, "acc_norm_stderr": 0.03925523381052932 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.049236596391733084, "acc_norm": 0.6, "acc_norm_stderr": 0.049236596391733084 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6875, "acc_stderr": 0.038760854559127644, "acc_norm": 0.6875, "acc_norm_stderr": 0.038760854559127644 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 
0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5780346820809249, "acc_stderr": 0.0376574669386515, "acc_norm": 0.5780346820809249, "acc_norm_stderr": 0.0376574669386515 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5446808510638298, "acc_stderr": 0.03255525359340355, "acc_norm": 0.5446808510638298, "acc_norm_stderr": 0.03255525359340355 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.40350877192982454, "acc_stderr": 0.04615186962583703, "acc_norm": 0.40350877192982454, "acc_norm_stderr": 0.04615186962583703 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6068965517241379, "acc_stderr": 0.0407032901370707, "acc_norm": 0.6068965517241379, "acc_norm_stderr": 0.0407032901370707 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3968253968253968, "acc_stderr": 0.02519710107424649, "acc_norm": 0.3968253968253968, "acc_norm_stderr": 0.02519710107424649 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377562, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377562 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7032258064516129, "acc_stderr": 0.025988500792411898, "acc_norm": 0.7032258064516129, "acc_norm_stderr": 0.025988500792411898 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.64, "acc_stderr": 0.048241815132442176, "acc_norm": 0.64, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7454545454545455, "acc_stderr": 0.03401506715249039, "acc_norm": 0.7454545454545455, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.030532892233932022, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.030532892233932022 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.02541634309630644, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.02541634309630644 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5615384615384615, "acc_stderr": 0.02515826601686858, "acc_norm": 0.5615384615384615, "acc_norm_stderr": 0.02515826601686858 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059285, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059285 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, 
"acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8091743119266055, "acc_stderr": 0.01684767640009109, "acc_norm": 0.8091743119266055, "acc_norm_stderr": 0.01684767640009109 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.03400603625538271, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.03400603625538271 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7647058823529411, "acc_stderr": 0.029771775228145628, "acc_norm": 0.7647058823529411, "acc_norm_stderr": 0.029771775228145628 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.028304657943035303, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.028304657943035303 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6188340807174888, "acc_stderr": 0.03259625118416827, "acc_norm": 0.6188340807174888, "acc_norm_stderr": 0.03259625118416827 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.732824427480916, "acc_stderr": 0.038808483010823944, "acc_norm": 0.732824427480916, "acc_norm_stderr": 0.038808483010823944 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8264462809917356, "acc_stderr": 0.0345727283691767, "acc_norm": 0.8264462809917356, "acc_norm_stderr": 0.0345727283691767 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.75, "acc_stderr": 0.04186091791394607, "acc_norm": 0.75, "acc_norm_stderr": 0.04186091791394607 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7423312883435583, "acc_stderr": 0.03436150827846917, "acc_norm": 0.7423312883435583, "acc_norm_stderr": 0.03436150827846917 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690879, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690879 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8547008547008547, "acc_stderr": 0.023086635086841407, "acc_norm": 0.8547008547008547, "acc_norm_stderr": 0.023086635086841407 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7803320561941252, "acc_stderr": 0.014805384478371153, "acc_norm": 0.7803320561941252, "acc_norm_stderr": 0.014805384478371153 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6734104046242775, "acc_stderr": 0.02524826477424284, "acc_norm": 0.6734104046242775, "acc_norm_stderr": 0.02524826477424284 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34972067039106147, "acc_stderr": 0.015949308790233645, "acc_norm": 0.34972067039106147, "acc_norm_stderr": 0.015949308790233645 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.696078431372549, "acc_stderr": 0.026336613469046626, "acc_norm": 0.696078431372549, "acc_norm_stderr": 0.026336613469046626 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.02592237178881877, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.02592237178881877 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6944444444444444, "acc_stderr": 0.02563082497562135, "acc_norm": 0.6944444444444444, "acc_norm_stderr": 0.02563082497562135 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4645390070921986, "acc_stderr": 0.029752389657427047, "acc_norm": 0.4645390070921986, "acc_norm_stderr": 0.029752389657427047 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.44002607561929596, "acc_stderr": 0.012678037478574513, "acc_norm": 0.44002607561929596, "acc_norm_stderr": 0.012678037478574513 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6102941176470589, "acc_stderr": 0.029624663581159696, "acc_norm": 0.6102941176470589, "acc_norm_stderr": 0.029624663581159696 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6241830065359477, "acc_stderr": 0.01959402113657744, "acc_norm": 0.6241830065359477, "acc_norm_stderr": 0.01959402113657744 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7090909090909091, "acc_stderr": 0.04350271442923243, "acc_norm": 0.7090909090909091, "acc_norm_stderr": 0.04350271442923243 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7224489795918367, "acc_stderr": 0.02866685779027465, "acc_norm": 0.7224489795918367, "acc_norm_stderr": 0.02866685779027465 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8009950248756219, "acc_stderr": 0.028231365092758406, "acc_norm": 0.8009950248756219, "acc_norm_stderr": 0.028231365092758406 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.03942772444036625, "acc_norm": 0.81, "acc_norm_stderr": 0.03942772444036625 }, "harness|hendrycksTest-virology|5": { "acc": 0.5, "acc_stderr": 0.03892494720807614, "acc_norm": 0.5, "acc_norm_stderr": 0.03892494720807614 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727668, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727668 }, "harness|truthfulqa:mc|0": { "mc1": 0.4773561811505508, "mc1_stderr": 0.01748554225848965, "mc2": 0.6348683354452056, "mc2_stderr": 0.015251462930296836 }, "harness|winogrande|5": { "acc": 0.7687450670876085, "acc_stderr": 0.011850040124850508 }, "harness|gsm8k|5": { "acc": 0.41698256254738436, "acc_stderr": 0.013581320997216588 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp
[ "region:us" ]
2024-02-12T11:14:16+00:00
{"pretty_name": "Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Radu1999/Mistral-Instruct-Ukrainian-slerp](https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T11:11:52.976201](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp/blob/main/results_2024-02-12T11-11-52.976201.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6128348134449864,\n \"acc_stderr\": 0.03306039267014507,\n \"acc_norm\": 0.6174798445939971,\n \"acc_norm_stderr\": 0.033726644979784004,\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6348683354452056,\n \"mc2_stderr\": 0.015251462930296836\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5767918088737202,\n \"acc_stderr\": 0.014438036220848029,\n \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.01418211986697487\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6528579964150567,\n \"acc_stderr\": 0.004750884401095161,\n \"acc_norm\": 0.8434574785899224,\n \"acc_norm_stderr\": 0.0036262628054422163\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n 
\"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5780346820809249,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.5780346820809249,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n \"acc_stderr\": 0.025988500792411898,\n \"acc_norm\": 0.7032258064516129,\n \"acc_norm_stderr\": 0.025988500792411898\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630644,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630644\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.02515826601686858,\n \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.02515826601686858\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n 
\"acc_stderr\": 0.014805384478371153,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371153\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.02563082497562135,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.02563082497562135\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44002607561929596,\n \"acc_stderr\": 0.012678037478574513,\n \"acc_norm\": 0.44002607561929596,\n \"acc_norm_stderr\": 0.012678037478574513\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159696,\n \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159696\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.01959402113657744,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.01959402113657744\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.01748554225848965,\n \"mc2\": 0.6348683354452056,\n \"mc2_stderr\": 0.015251462930296836\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41698256254738436,\n \"acc_stderr\": 0.013581320997216588\n }\n}\n```", "repo_url": 
"https://huggingface.co/Radu1999/Mistral-Instruct-Ukrainian-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|arc:challenge|25_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|gsm8k|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hellaswag|10_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T11_11_52.976201", "path": ["**/details_harness|winogrande|5_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T11-11-52.976201.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_12T11_11_52.976201", "path": ["results_2024-02-12T11-11-52.976201.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T11-11-52.976201.parquet"]}]}]}
2024-02-12T11:14:43+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-slerp Dataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T11:11:52.976201 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
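A minimal sketch of the loading call the card refers to; the repo id is assumed to follow the leaderboard's standard details_<org>__<model> convention:

```python
from datasets import load_dataset

# Repo id assumed from the standard details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_Radu1999__Mistral-Instruct-Ukrainian-slerp",
    "harness_winogrande_5",
    split="train",
)
```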
[ "# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-slerp\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T11:11:52.976201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Radu1999/Mistral-Instruct-Ukrainian-slerp\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mistral-Instruct-Ukrainian-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T11:11:52.976201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
0cb7343e818cd2b67cbfd2fbe0fe435ec4a2ca78
--- <b>Word Sense Disambiguation Dataset in Historical Danish and Norwegian Texts </b> <b>Overview</b> This README provides information about the dataset for word sense disambiguation in historical texts, focusing on the transformation of the concept of fate ('skæbne') from its pre-modern to modern sense in the latter part of the 19th century. The dataset was established and annotated by a Danish-speaking literary scholar. <b>Task Description</b> The primary task addressed by this dataset is word sense disambiguation (WSD) in the context of historical texts. Specifically, the dataset explores the evolution of the concept of fate from its pre-modern, religiously and metaphysically inflected sense to a modern meaning that incorporates a secular and material understanding of the world. <b>Dataset Creation</b> The dataset is introduced through a novel annotation process carried out by one of the authors, a Danish-speaking literary scholar. The focus is on the semantic and ideological division of the concept of fate in the latter part of the 19th century. <b>Dataset Details</b> Size: The dataset consists of 104 segments in total, with 48 segments representing the pre-modern sense and 56 segments representing the modern notion of fate. Labeling: Segments are labeled as either pre-modern or modern based on predefined criteria. The labels generally correspond to the dichotomy between religious/metaphysical notions and secular/material notions. Content Focus: The dataset was produced with a specific focus on conceptual content. Note that the word 'skæbne' may not appear in every segment. Word Occurrence: The word 'skæbne' is present in 27 out of the 48 segments representing the pre-modern sense and in 24 out of the 56 segments representing the modern notion. Splitting: The dataset is split into training, development, and testing sets with proportions 80%, 10%, and 10%, respectively. <b>Usage</b> Researchers and practitioners interested in WSD in historical texts, particularly in the evolution of the concept of fate, can utilize this dataset for training, development, and evaluation purposes. <b>Citation</b>
MiMe-MeMo/MeMo-Dataset-WSD
[ "region:us" ]
2024-02-12T11:56:18+00:00
{}
2024-02-12T13:31:44+00:00
[]
[]
TAGS #region-us
--- <b>Word Sense Disambiguation Dataset in Historical Danish and Norwegian Texts </b> <b>Overview</b> This README provides information about the dataset for word sense disambiguation in historical texts, focusing on the transformation of the concept of fate ('skæbne') from its pre-modern to modern sense in the latter part of the 19th century. The dataset was established and annotated by a Danish-speaking literary scholar. <b>Task Description</b> The primary task addressed by this dataset is word sense disambiguation (WSD) in the context of historical texts. Specifically, the dataset explores the evolution of the concept of fate from its pre-modern, religiously and metaphysically inflected sense to a modern meaning that incorporates a secular and material understanding of the world. <b>Dataset Creation</b> The dataset is introduced through a novel annotation process carried out by one of the authors, a Danish-speaking literary scholar. The focus is on the semantic and ideological division of the concept of fate in the latter part of the 19th century. <b>Dataset Details</b> Size: The dataset consists of 104 segments in total, with 48 segments representing the pre-modern sense and 56 segments representing the modern notion of fate. Labeling: Segments are labeled as either pre-modern or modern based on predefined criteria. The labels generally correspond to the dichotomy between religious/metaphysical notions and secular/material notions. Content Focus: The dataset was produced with a specific focus on conceptual content. Note that the word 'skæbne' may not appear in every segment. Word Occurrence: The word 'skæbne' is present in 27 out of the 48 segments representing the pre-modern sense and in 24 out of the 56 segments representing the modern notion. Splitting: The dataset is split into training, development, and testing sets with proportions 80%, 10%, and 10%, respectively. <b>Usage</b> Researchers and practitioners interested in WSD in historical texts, particularly in the evolution of the concept of fate, can utilize this dataset for training, development, and evaluation purposes. <b>Citation</b>
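Since the card gives no loading instructions, here is a minimal, hedged sketch using the dataset id recorded above; the split names and column layout are assumptions to verify on first load, as the card only states the 80%/10%/10% proportions:

```python
from datasets import load_dataset

# Split names ("train"/"validation"/"test") and feature names are
# assumptions: the card only gives the 80%/10%/10% proportions.
ds = load_dataset("MiMe-MeMo/MeMo-Dataset-WSD")
print(ds)  # inspect the actual split names, sizes, and label fields before use
```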
[]
[ "TAGS\n#region-us \n" ]
02630c882e9bb43d63809ff11fb400e7d7727c35
# Dataset Card for Evaluation run of liminerity/Omningotex-7b-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [liminerity/Omningotex-7b-slerp](https://huggingface.co/liminerity/Omningotex-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_liminerity__Omningotex-7b-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T12:08:20.064811](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Omningotex-7b-slerp/blob/main/results_2024-02-12T12-08-20.064811.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6532130250237066, "acc_stderr": 0.03195736938804737, "acc_norm": 0.6524731790497946, "acc_norm_stderr": 0.032625282824955575, "mc1": 0.620563035495716, "mc1_stderr": 0.01698703926614297, "mc2": 0.7632159986876987, "mc2_stderr": 0.014100501114162318 }, "harness|arc:challenge|25": { "acc": 0.7107508532423208, "acc_stderr": 0.013250012579393441, "acc_norm": 0.7329351535836177, "acc_norm_stderr": 0.01292893319649636 }, "harness|hellaswag|10": { "acc": 0.7145986855208126, "acc_stderr": 0.004506824094333298, "acc_norm": 0.8895638319059949, "acc_norm_stderr": 0.003127920738394107 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.02815283794249387, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.02815283794249387 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32592592592592595, "acc_stderr": 0.028578348365473082, "acc_norm": 0.32592592592592595, "acc_norm_stderr": 0.028578348365473082 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 
0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461763, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461763 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8480392156862745, "acc_stderr": 0.025195658428931792, "acc_norm": 0.8480392156862745, "acc_norm_stderr": 0.025195658428931792 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290916, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290916 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069363, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069363 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4480446927374302, "acc_stderr": 0.016631976628930595, "acc_norm": 0.4480446927374302, "acc_norm_stderr": 0.016631976628930595 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7222222222222222, "acc_stderr": 0.0256468630971379, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.0256468630971379 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.02575586592263295, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.02575586592263295 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657473, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657473 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.02824568739146292, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.02824568739146292 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6715686274509803, "acc_stderr": 0.018999707383162673, "acc_norm": 0.6715686274509803, "acc_norm_stderr": 0.018999707383162673 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128448, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128448 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.02519692987482706, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.02519692987482706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699121, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699121 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.620563035495716, "mc1_stderr": 0.01698703926614297, "mc2": 0.7632159986876987, "mc2_stderr": 0.014100501114162318 }, "harness|winogrande|5": { "acc": 0.8421468034727704, "acc_stderr": 0.010247165248719763 }, "harness|gsm8k|5": { "acc": 0.7050796057619408, "acc_stderr": 0.012560698010954767 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
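Beyond the `harness_winogrande_5` snippet shown in the card, any per-task config declared in the metadata below loads the same way; as a minimal sketch, pulling the latest GSM8K details:

```python
from datasets import load_dataset

# "harness_gsm8k_5" is one of the config_names declared in the metadata;
# "latest" aliases the newest timestamped split (2024_02_12T12_08_20.064811).
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_liminerity__Omningotex-7b-slerp",
    "harness_gsm8k_5",
    split="latest",
)
```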
open-llm-leaderboard/details_liminerity__Omningotex-7b-slerp
[ "region:us" ]
2024-02-12T12:10:47+00:00
{"pretty_name": "Evaluation run of liminerity/Omningotex-7b-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [liminerity/Omningotex-7b-slerp](https://huggingface.co/liminerity/Omningotex-7b-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__Omningotex-7b-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T12:08:20.064811](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__Omningotex-7b-slerp/blob/main/results_2024-02-12T12-08-20.064811.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532130250237066,\n \"acc_stderr\": 0.03195736938804737,\n \"acc_norm\": 0.6524731790497946,\n \"acc_norm_stderr\": 0.032625282824955575,\n \"mc1\": 0.620563035495716,\n \"mc1_stderr\": 0.01698703926614297,\n \"mc2\": 0.7632159986876987,\n \"mc2_stderr\": 0.014100501114162318\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.01292893319649636\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7145986855208126,\n \"acc_stderr\": 0.004506824094333298,\n \"acc_norm\": 0.8895638319059949,\n \"acc_norm_stderr\": 0.003127920738394107\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 
0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4480446927374302,\n \"acc_stderr\": 0.016631976628930595,\n \"acc_norm\": 0.4480446927374302,\n \"acc_norm_stderr\": 0.016631976628930595\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.620563035495716,\n \"mc1_stderr\": 0.01698703926614297,\n \"mc2\": 0.7632159986876987,\n \"mc2_stderr\": 0.014100501114162318\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8421468034727704,\n \"acc_stderr\": 0.010247165248719763\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \"acc_stderr\": 0.012560698010954767\n }\n}\n```", "repo_url": 
"https://huggingface.co/liminerity/Omningotex-7b-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|arc:challenge|25_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|gsm8k|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hellaswag|10_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-08-20.064811.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-08-20.064811.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-08-20.064811.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T12-08-20.064811.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-08-20.064811.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-08-20.064811.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["**/details_harness|winogrande|5_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T12-08-20.064811.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T12_08_20.064811", "path": ["results_2024-02-12T12-08-20.064811.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T12-08-20.064811.parquet"]}]}]}
2024-02-12T12:11:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of liminerity/Omningotex-7b-slerp Dataset automatically created during the evaluation run of model liminerity/Omningotex-7b-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T12:08:20.064811 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
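The flattened card above drops the snippet that followed "do the following:", so here is a minimal sketch of that load, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (the exact repository id is inferred from the model name, not confirmed in this record):

```python
from datasets import load_dataset

# Repository id inferred from the model name liminerity/Omningotex-7b-slerp;
# "harness_winogrande_5" is one of the 63 configs listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_liminerity__Omningotex-7b-slerp",
    "harness_winogrande_5",
    split="train",
)
```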
[ "# Dataset Card for Evaluation run of liminerity/Omningotex-7b-slerp\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Omningotex-7b-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T12:08:20.064811(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of liminerity/Omningotex-7b-slerp\n\n\n\nDataset automatically created during the evaluation run of model liminerity/Omningotex-7b-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T12:08:20.064811(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
77ca0a20794d4bb0c30668a0341b0953792d9960
# Georgian-Homonym-Disambiguation This repository contains all the datasets for the Georgian homonym disambiguation task. For more specific details, you can read my <a href="https://github.com/ddamel/Georgian-Homonym-Disambiguation">article</a>. ## Dataset At this point I've considered only the homonym "ბარი" and its different grammatical forms, obtaining 7522 sentences. The "dataset.parquet" includes: - 763 sentences using "ბარი" as a "shovel", labeled with 0 - 1846 sentences using "ბარი" as a "lowland", labeled with 1 - 3320 sentences using "ბარი" as a "cafe", labeled with 2 - 1593 sentences where the homonym is used in a different context, labeled with 3 (Although these sentences could be further classified by the definitions of the homonyms, for this project I've ignored other usages). The column 'homonym_index' contains the index of the homonym in the sentence, that is, the index of the word which is the homonym. The "full-homonym-sentences-ბარ.txt" file includes the sentences which contain the homonym "ბარი" and its various grammatical forms. These sentences were limited to a maximum length of 13 words, with the homonym positioned in the middle of each sentence. There are around 28,000 of them and they are not labeled.
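As a rough sketch of how the labeled split described above might be inspected: the file name "dataset.parquet" and the 'homonym_index' column come from the card, while the sentence and label column names ("sentence" and "label") are assumptions for illustration.

```python
import pandas as pd

# Load the labeled split described in the card above.
df = pd.read_parquet("dataset.parquet")

# "homonym_index" is documented in the card; the column names
# "sentence" and "label" are assumed here for illustration.
row = df.iloc[0]
words = row["sentence"].split()
print(words[row["homonym_index"]], "->", row["label"])
```

Printing the token at 'homonym_index' alongside its sense label (0-3) is a quick sanity check that the index aligns with whitespace tokenization.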
davmel/ka_homonym_disambiguation
[ "task_categories:text-classification", "size_categories:1M<n<10M", "language:ka", "license:mit", "region:us" ]
2024-02-12T12:11:43+00:00
{"language": ["ka"], "license": "mit", "size_categories": ["1M<n<10M"], "task_categories": ["text-classification"]}
2024-02-12T13:15:41+00:00
[]
[ "ka" ]
TAGS #task_categories-text-classification #size_categories-1M<n<10M #language-Georgian #license-mit #region-us
# Georgian-Homonym-Disambiguation This repository contains all the datasets for the Georgian homonym disambiguation task. For more specific details, you can read my <a href="URL ## Dataset At this point I've considered only the homonym "ბარი" and its different grammatical forms, obtaining 7522 sentences. The "dataset.parquet" includes: - 763 sentences using "ბარი" as a "shovel", labeled with 0 - 1846 sentences using "ბარი" as a "lowland", labeled with 1 - 3320 sentences using "ბარი" as a "cafe", labeled with 2 - 1593 sentences where the homonym is used in a different context, labeled with 3 (Although these sentences could be further classified by the definitions of the homonyms, for this project I've ignored other usages). The column 'homonym_index' contains the index of the homonym in the sentence, that is, the index of the word which is the homonym. The "full-homonym-sentences-ბარ.txt" file includes the sentences which contain the homonym "ბარი" and its various grammatical forms. These sentences were limited to a maximum length of 13 words, with the homonym positioned in the middle of each sentence. There are around 28,000 of them and they are not labeled.
[ "# Georgian-Homonym-Disambiguation\nThis repository contains all the datasets for the Georgian homonym disambiguation task.\n\nFor more specific details you can read my <a href=\"URL", "## Dataset\nAt this point I've considered only the homonym: \"ბარი\" and it's different grammatical forms obtaining 7522 sentences.\n\nThe \"dataset.parquet\" includes:\n\n- 763 sentences using \"ბარი\" as a \"shovel\" labaled with 0\n- 1846 sentences using \"ბარი\" as a \"lowland\" labeld with 1\n- 3320 sentences using \"ბარი\" as a \"cafe\" labeled with 2 \n- 1593 sentences where the homonym is used in a different context, labeled with 3 (Although these sentences could be further classified by the definitions of the homonyms, for this project I've ignored other usages).\n\nthe column 'homonym_index' contains the index of the homonym in the sentence, that is, the index of the word which is the homonym.\n\nThe \"full-homonym-sentences-ბარ.txt\" includes the sentences which contain the homonym \"ბარი\" and it's various grammatical forms. These sentences were limited to a\nmaximum length of 13 words, with the homonym positioned in the middle of each sentence. They are around 28000 and are not labelled." ]
[ "TAGS\n#task_categories-text-classification #size_categories-1M<n<10M #language-Georgian #license-mit #region-us \n", "# Georgian-Homonym-Disambiguation\nThis repository contains all the datasets for the Georgian homonym disambiguation task.\n\nFor more specific details you can read my <a href=\"URL", "## Dataset\nAt this point I've considered only the homonym: \"ბარი\" and it's different grammatical forms obtaining 7522 sentences.\n\nThe \"dataset.parquet\" includes:\n\n- 763 sentences using \"ბარი\" as a \"shovel\" labaled with 0\n- 1846 sentences using \"ბარი\" as a \"lowland\" labeld with 1\n- 3320 sentences using \"ბარი\" as a \"cafe\" labeled with 2 \n- 1593 sentences where the homonym is used in a different context, labeled with 3 (Although these sentences could be further classified by the definitions of the homonyms, for this project I've ignored other usages).\n\nthe column 'homonym_index' contains the index of the homonym in the sentence, that is, the index of the word which is the homonym.\n\nThe \"full-homonym-sentences-ბარ.txt\" includes the sentences which contain the homonym \"ბარი\" and it's various grammatical forms. These sentences were limited to a\nmaximum length of 13 words, with the homonym positioned in the middle of each sentence. They are around 28000 and are not labelled." ]
dc78c932ecc78341bd8e7923a6bbc6d71a267642
# Dataset Card for "test_dataset-0212" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ouvic215/test_dataset-0212
[ "region:us" ]
2024-02-12T12:18:38+00:00
{"dataset_info": {"features": [{"name": "mask_image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 147332332.0, "num_examples": 1588}], "download_size": 146499523, "dataset_size": 147332332.0}}
2024-02-12T12:19:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "test_dataset-0212" More Information needed
[ "# Dataset Card for \"test_dataset-0212\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"test_dataset-0212\"\n\nMore Information needed" ]
4962b8a7d5e3edb94339e0e0fc3835f07509b821
# Dataset Card for Evaluation run of paulml/NMTOB-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [paulml/NMTOB-7B](https://huggingface.co/paulml/NMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_paulml__NMTOB-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T12:41:06.200570](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__NMTOB-7B/blob/main/results_2024-02-12T12-41-06.200570.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6524767492630037, "acc_stderr": 0.03200904140407624, "acc_norm": 0.6518349230917634, "acc_norm_stderr": 0.032679616576566206, "mc1": 0.6034271725826194, "mc1_stderr": 0.017124930942023515, "mc2": 0.7506354467048914, "mc2_stderr": 0.01429535038329162 }, "harness|arc:challenge|25": { "acc": 0.7056313993174061, "acc_stderr": 0.013318528460539419, "acc_norm": 0.7303754266211604, "acc_norm_stderr": 0.012968040686869147 }, "harness|hellaswag|10": { "acc": 0.7153953395737901, "acc_stderr": 0.004503037601847085, "acc_norm": 0.8893646683927504, "acc_norm_stderr": 0.0031303894668331987 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.040943762699967926, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.040943762699967926 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 
0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944433, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944433 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642514, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642514 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6564102564102564, "acc_stderr": 0.024078696580635477, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.024078696580635477 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3111111111111111, "acc_stderr": 0.028226446749683512, "acc_norm": 0.3111111111111111, "acc_norm_stderr": 0.028226446749683512 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 
0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926924, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926924 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7766990291262136, "acc_stderr": 0.04123553189891431, "acc_norm": 0.7766990291262136, "acc_norm_stderr": 0.04123553189891431 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8237547892720306, "acc_stderr": 0.013625556907993466, "acc_norm": 0.8237547892720306, "acc_norm_stderr": 0.013625556907993466 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.023786203255508287, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.023786203255508287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4446927374301676, "acc_stderr": 0.01661988198817702, "acc_norm": 0.4446927374301676, "acc_norm_stderr": 0.01661988198817702 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7407407407407407, "acc_stderr": 0.024383665531035457, "acc_norm": 0.7407407407407407, "acc_norm_stderr": 0.024383665531035457 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 
0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46936114732724904, "acc_stderr": 0.012746237711716634, "acc_norm": 0.46936114732724904, "acc_norm_stderr": 0.012746237711716634 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6650326797385621, "acc_stderr": 0.019094228167000328, "acc_norm": 0.6650326797385621, "acc_norm_stderr": 0.019094228167000328 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233268, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233268 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.572289156626506, "acc_stderr": 0.038515976837185335, "acc_norm": 0.572289156626506, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6034271725826194, "mc1_stderr": 0.017124930942023515, "mc2": 0.7506354467048914, "mc2_stderr": 0.01429535038329162 }, "harness|winogrande|5": { "acc": 0.8516179952644041, "acc_stderr": 0.009990706005184138 }, "harness|gsm8k|5": { "acc": 0.6899166034874905, "acc_stderr": 0.01274030571737627 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_paulml__NMTOB-7B
[ "region:us" ]
2024-02-12T12:43:25+00:00
{"pretty_name": "Evaluation run of paulml/NMTOB-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [paulml/NMTOB-7B](https://huggingface.co/paulml/NMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__NMTOB-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T12:41:06.200570](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__NMTOB-7B/blob/main/results_2024-02-12T12-41-06.200570.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6524767492630037,\n \"acc_stderr\": 0.03200904140407624,\n \"acc_norm\": 0.6518349230917634,\n \"acc_norm_stderr\": 0.032679616576566206,\n \"mc1\": 0.6034271725826194,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.7506354467048914,\n \"mc2_stderr\": 0.01429535038329162\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.013318528460539419,\n \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869147\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7153953395737901,\n \"acc_stderr\": 0.004503037601847085,\n \"acc_norm\": 0.8893646683927504,\n \"acc_norm_stderr\": 0.0031303894668331987\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n 
\"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993466\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035457,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035457\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6034271725826194,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.7506354467048914,\n \"mc2_stderr\": 0.01429535038329162\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184138\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \"acc_stderr\": 0.01274030571737627\n }\n}\n```", "repo_url": "https://huggingface.co/paulml/NMTOB-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|arc:challenge|25_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|gsm8k|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hellaswag|10_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T12-41-06.200570.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["**/details_harness|winogrande|5_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T12-41-06.200570.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T12_41_06.200570", "path": ["results_2024-02-12T12-41-06.200570.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T12-41-06.200570.parquet"]}]}]}
2024-02-12T12:44:01+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of paulml/NMTOB-7B Dataset automatically created during the evaluation run of model paulml/NMTOB-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T12:41:06.200570 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of paulml/NMTOB-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/NMTOB-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T12:41:06.200570(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of paulml/NMTOB-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/NMTOB-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T12:41:06.200570(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
202553b1408ced82f7b52afb8f9d6e84a1c97167
# Dataset Card for "TestDataset-0212-1" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ouvic215/TestDataset-0212-1
[ "region:us" ]
2024-02-12T12:55:50+00:00
{"dataset_info": {"features": [{"name": "mask_image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 147332332.0, "num_examples": 1588}], "download_size": 146499523, "dataset_size": 147332332.0}}
2024-02-12T12:56:14+00:00
[]
[]
TAGS #region-us
# Dataset Card for "TestDataset-0212-1" More Information needed
[ "# Dataset Card for \"TestDataset-0212-1\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"TestDataset-0212-1\"\n\nMore Information needed" ]
0ab176a7cb47669c4737246cbcbc4f1b07d203a7
# Dataset Card for Evaluation run of paulml/DPOB-NMTOB-7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [paulml/DPOB-NMTOB-7B](https://huggingface.co/paulml/DPOB-NMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_paulml__DPOB-NMTOB-7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-02-12T13:03:38.568467](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__DPOB-NMTOB-7B/blob/main/results_2024-02-12T13-03-38.568467.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.6530989447346045, "acc_stderr": 0.031982430662079626, "acc_norm": 0.652496047322301, "acc_norm_stderr": 0.032651854912140676, "mc1": 0.6046511627906976, "mc1_stderr": 0.017115815632418208, "mc2": 0.7507547852305267, "mc2_stderr": 0.014295572507171578 }, "harness|arc:challenge|25": { "acc": 0.7056313993174061, "acc_stderr": 0.013318528460539419, "acc_norm": 0.7312286689419796, "acc_norm_stderr": 0.012955065963710696 }, "harness|hellaswag|10": { "acc": 0.7152957578171679, "acc_stderr": 0.0045035118550500325, "acc_norm": 0.8894642501493726, "acc_norm_stderr": 0.003129155503881715 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.028152837942493864, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.028152837942493864 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31,
"acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5702127659574469, "acc_stderr": 0.03236214467715564, "acc_norm": 0.5702127659574469, "acc_norm_stderr": 0.03236214467715564 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944433, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944433 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726854, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726854 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7636363636363637, "acc_stderr": 0.03317505930009182, "acc_norm": 0.7636363636363637, "acc_norm_stderr": 0.03317505930009182 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218967, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218967 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6564102564102564, "acc_stderr": 0.024078696580635477, "acc_norm": 0.6564102564102564, "acc_norm_stderr": 0.024078696580635477 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3148148148148148, "acc_stderr": 0.028317533496066485, "acc_norm": 0.3148148148148148, "acc_norm_stderr": 0.028317533496066485 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6638655462184874, "acc_stderr": 0.03068473711513537, "acc_norm": 0.6638655462184874, "acc_norm_stderr": 0.03068473711513537 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.03958027231121569, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 
0.03958027231121569 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.49537037037037035, "acc_stderr": 0.03409825519163572, "acc_norm": 0.49537037037037035, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.025845017986926924, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.025845017986926924 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8250319284802043, "acc_stderr": 0.013586619219903348, "acc_norm": 0.8250319284802043, "acc_norm_stderr": 0.013586619219903348 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7398843930635838, "acc_stderr": 0.023618678310069363, "acc_norm": 0.7398843930635838, "acc_norm_stderr": 0.023618678310069363 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4424581005586592, "acc_stderr": 0.016611393687268584, "acc_norm": 0.4424581005586592, "acc_norm_stderr": 0.016611393687268584 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7041800643086816, "acc_stderr": 0.025922371788818763, "acc_norm": 0.7041800643086816, "acc_norm_stderr": 0.025922371788818763 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5, "acc_stderr": 0.029827499313594685, 
"acc_norm": 0.5, "acc_norm_stderr": 0.029827499313594685 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4654498044328553, "acc_stderr": 0.012739711554045704, "acc_norm": 0.4654498044328553, "acc_norm_stderr": 0.012739711554045704 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6666666666666666, "acc_stderr": 0.0190709855896875, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.0190709855896875 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699121, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699121 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6046511627906976, "mc1_stderr": 0.017115815632418208, "mc2": 0.7507547852305267, "mc2_stderr": 0.014295572507171578 }, "harness|winogrande|5": { "acc": 0.8516179952644041, "acc_stderr": 0.00999070600518414 }, "harness|gsm8k|5": { "acc": 0.6899166034874905, "acc_stderr": 0.01274030571737627 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_paulml__DPOB-NMTOB-7B
[ "region:us" ]
2024-02-12T13:05:59+00:00
{"pretty_name": "Evaluation run of paulml/DPOB-NMTOB-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [paulml/DPOB-NMTOB-7B](https://huggingface.co/paulml/DPOB-NMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__DPOB-NMTOB-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T13:03:38.568467](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__DPOB-NMTOB-7B/blob/main/results_2024-02-12T13-03-38.568467.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530989447346045,\n \"acc_stderr\": 0.031982430662079626,\n \"acc_norm\": 0.652496047322301,\n \"acc_norm_stderr\": 0.032651854912140676,\n \"mc1\": 0.6046511627906976,\n \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7507547852305267,\n \"mc2_stderr\": 0.014295572507171578\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7056313993174061,\n \"acc_stderr\": 0.013318528460539419,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7152957578171679,\n \"acc_stderr\": 0.0045035118550500325,\n \"acc_norm\": 0.8894642501493726,\n \"acc_norm_stderr\": 0.003129155503881715\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 
0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 
0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513537,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513537\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903348,\n \"acc_norm\": 0.8250319284802043,\n 
\"acc_norm_stderr\": 0.013586619219903348\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 0.4424581005586592,\n \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6046511627906976,\n \"mc1_stderr\": 0.017115815632418208,\n \"mc2\": 0.7507547852305267,\n \"mc2_stderr\": 0.014295572507171578\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.00999070600518414\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \"acc_stderr\": 0.01274030571737627\n }\n}\n```", "repo_url": "https://huggingface.co/paulml/DPOB-NMTOB-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|arc:challenge|25_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|gsm8k|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hellaswag|10_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-03-38.568467.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-03-38.568467.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-03-38.568467.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T13-03-38.568467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-03-38.568467.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-03-38.568467.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["**/details_harness|winogrande|5_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T13-03-38.568467.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T13_03_38.568467", "path": ["results_2024-02-12T13-03-38.568467.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T13-03-38.568467.parquet"]}]}]}
2024-02-12T13:06:23+00:00
[]
[]
1b2852faa6f37a3114d3d79930ba965f1e1f207f
# Dataset Card for "live_ATC_GTI8938" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
adityarra07/live_ATC_GTI8938
[ "region:us" ]
2024-02-12T13:22:32+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}], "splits": [{"name": "train", "num_bytes": 6097701.0, "num_examples": 6}], "download_size": 6084444, "dataset_size": 6097701.0}}
2024-02-12T13:22:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for "live_ATC_GTI8938" More Information needed
[ "# Dataset Card for \"live_ATC_GTI8938\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"live_ATC_GTI8938\"\n\nMore Information needed" ]
a9fc470c4cd1c36d66a7e3422cc8ee60c71f4857
# Dataset Card for "live_ATC_KCKB" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
adityarra07/live_ATC_KCKB
[ "region:us" ]
2024-02-12T13:22:44+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}], "splits": [{"name": "train", "num_bytes": 2866257.0, "num_examples": 8}], "download_size": 1412026, "dataset_size": 2866257.0}}
2024-02-12T13:22:46+00:00
[]
[]
TAGS #region-us
# Dataset Card for "live_ATC_KCKB" More Information needed
[ "# Dataset Card for \"live_ATC_KCKB\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"live_ATC_KCKB\"\n\nMore Information needed" ]
3393ddadb4a5d6c5de2d622208ecc69b6905c26d
# Dataset This is a translated version of a subset from [OpenHermes](https://huggingface.co/datasets/teknium/openhermes). Coding tasks and word-play such as anagrams have been removed. It has been translated using [SeamlessM4T v2](https://huggingface.co/facebook/seamless-m4t-v2-large) T2T.
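The card does not include the translation script itself; below is a minimal sketch of the kind of SeamlessM4T v2 text-to-text call it describes. The example sentence is illustrative, and `eng`/`dan` are the Seamless language codes for English and Danish:

```python
from transformers import AutoProcessor, SeamlessM4Tv2ForTextToText

# Minimal sketch, not the authors' actual pipeline: translate one English
# instruction to Danish with SeamlessM4T v2 (text-to-text).
processor = AutoProcessor.from_pretrained("facebook/seamless-m4t-v2-large")
model = SeamlessM4Tv2ForTextToText.from_pretrained("facebook/seamless-m4t-v2-large")

text_inputs = processor(text="How do plants make their own food?", src_lang="eng", return_tensors="pt")
output_tokens = model.generate(**text_inputs, tgt_lang="dan")
print(processor.decode(output_tokens[0].tolist(), skip_special_tokens=True))
```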
Mabeck/danish-OpenHermes
[ "task_categories:question-answering", "size_categories:100K<n<1M", "language:da", "license:mit", "region:us" ]
2024-02-12T13:22:59+00:00
{"language": ["da"], "license": "mit", "size_categories": ["100K<n<1M"], "task_categories": ["question-answering"]}
2024-02-14T21:48:47+00:00
[]
[ "da" ]
TAGS #task_categories-question-answering #size_categories-100K<n<1M #language-Danish #license-mit #region-us
# Dataset This is a translated version of a subset from OpenHermes. Coding tasks and word-play such as anagrams have been removed. It has been translated using SeamlessM4T v2 T2T.
[ "# Dataset\n\nThis is a translated version of a subset from OpenHermes. Coding tasks and word-play such as anagrams have been removed.\n\nIt has been translated using SeamlessM4T v2 T2T." ]
[ "TAGS\n#task_categories-question-answering #size_categories-100K<n<1M #language-Danish #license-mit #region-us \n", "# Dataset\n\nThis is a translated version of a subset from OpenHermes. Coding tasks and word-play such as anagrams have been removed.\n\nIt has been translated using SeamlessM4T v2 T2T." ]
a0ba3a26833916f1a51ece633ea2151b187f0825
# chatml-OpenHermes2.5-dpo-binarized-alpha

This is a DPO dataset based on [argilla/OpenHermes2.5-dpo-binarized-alpha](https://huggingface.co/datasets/argilla/OpenHermes2.5-dpo-binarized-alpha). It implements the following features:

* **ChatML format**: you can directly use this dataset without the need to apply a chat template.
* **Filter out low scores**: removed samples with delta scores < 1 (530 in the training set, 66 in the test set).
* **Curriculum learning**: sort the dataset by the 'delta_score' column in descending order.

## 💻 Code

Code to reproduce this dataset:

```python
!pip install -qqq datasets transformers pandas

from transformers import AutoTokenizer
from datasets import load_dataset
import pandas as pd

# Load the dataset
dataset = load_dataset('argilla/OpenHermes2.5-dpo-binarized-alpha')

def chatml_format(example):
    # Format instruction
    message = {"role": "user", "content": example['input']}
    prompt = tokenizer.apply_chat_template([message], tokenize=False, add_generation_prompt=True)

    # Format chosen answer
    chosen = tokenizer.apply_chat_template(example['chosen'], tokenize=False, add_generation_prompt=False)[len(prompt):]

    # Format rejected answer
    rejected = tokenizer.apply_chat_template(example['rejected'], tokenize=False, add_generation_prompt=False)[len(prompt):]

    # Calculate score difference
    delta_score = abs(example['rating'][0] - example['rating'][1])

    return {
        "prompt": prompt,
        "chosen": chosen,
        "rejected": rejected,
        "delta_score": delta_score,
    }

# Load tokenizer (chatml format)
model_name = "mlabonne/NeuralHermes-2.5-Mistral-7B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "left"

# Format dataset
dataset_chatml = dataset.map(
    chatml_format,
    remove_columns=['input', 'chosen', 'rejected', 'conversations', 'generation_model', 'generation_prompt', 'raw_generation_responses', 'generations', 'views', 'system_prompt', 'model_name', 'language', 'id', 'hash', 'model', 'avatarUrl', 'custom_instruction', 'topic', 'title', 'idx', 'rejected_score', 'chosen_score']
)

# Remove low delta scores
dataset_chatml = dataset_chatml.filter(lambda x: x["delta_score"] > 1.0)

# Sort the dataset by the 'delta_score' column in descending order
dataset_chatml = dataset_chatml.sort('delta_score', reverse=True)

pd.DataFrame(dataset_chatml['train']).iloc[:10]
```
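To use the published dataset directly instead of rebuilding it, here is a minimal loading sketch (the split names and `delta_score` column are taken from this record's metadata):

```python
from datasets import load_dataset

dataset = load_dataset("mlabonne/chatml-OpenHermes2.5-dpo-binarized-alpha")

# The train split is sorted by delta_score in descending order (curriculum
# learning), so the first row is the highest-margin preference pair.
print(dataset["train"][0]["delta_score"])
```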
mlabonne/chatml-OpenHermes2.5-dpo-binarized-alpha
[ "region:us" ]
2024-02-12T13:30:48+00:00
{"dataset_info": {"features": [{"name": "category", "dtype": "string"}, {"name": "skip_prompt_formatting", "dtype": "bool"}, {"name": "source", "dtype": "string"}, {"name": "rating", "sequence": "float32"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "chosen_model", "dtype": "string"}, {"name": "rejected_model", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "delta_score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 23171896.91989107, "num_examples": 8283}, {"name": "test", "num_bytes": 2520804.53877551, "num_examples": 914}], "download_size": 14606963, "dataset_size": 25692701.45866658}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]}
2024-02-13T19:38:31+00:00
[]
[]
TAGS #region-us
# chatml-OpenHermes2.5-dpo-binarized-alpha This is a DPO dataset based on argilla/OpenHermes2.5-dpo-binarized-alpha. It implements the following features: * ChatML format: you can directly use this dataset without the need to apply a chat template. * Filter out low scores: removed samples with delta scores < 1 (530 in the training set, 66 in the test set). * Curriculum learning: sort the dataset by the 'delta_score' column in descending order. ## Code Code to reproduce this dataset:
[ "# chatml-OpenHermes2.5-dpo-binarized-alpha\n\nThis is a DPO dataset based on argilla/OpenHermes2.5-dpo-binarized-alpha. It implements the following features:\n\n* ChatML format: you can directly use this dataset without the need to apply a chat template.\n* Filter out low scores: removed samples with delta scores < 1 (530 in the training set, 66 in the test set).\n* Curriculum learning: sort the dataset by the 'delta_score' column in descending order.", "## Code\n\nCode to reproduce this dataset:" ]
[ "TAGS\n#region-us \n", "# chatml-OpenHermes2.5-dpo-binarized-alpha\n\nThis is a DPO dataset based on argilla/OpenHermes2.5-dpo-binarized-alpha. It implements the following features:\n\n* ChatML format: you can directly use this dataset without the need to apply a chat template.\n* Filter out low scores: removed samples with delta scores < 1 (530 in the training set, 66 in the test set).\n* Curriculum learning: sort the dataset by the 'delta_score' column in descending order.", "## Code\n\nCode to reproduce this dataset:" ]
b05dc97e84a25c8cd30943189515c26ac38cf211
2 DATASETS:

1) Generated sequences: generated_sequences/generated_dataset_corrected_index.fasta
2) Natural sequences: generated_sequences/natural_dataset.fasta

RELATE REACTIONS AND SEQUENCES:

In the natural sequences dataset, sequences can be related to the reaction used to generate them: the sequence tag indicates the SMILES of the sequence.

In the generated sequences dataset, sequences can be related to the reaction used to generate them:

* Example of one sequence tag of the generated dataset: holdout_9_reac_generated/holdout_0_15.743307701421433_497. This belongs to the reaction with index number 9 (index numbers and reactions are stored in the folder generated_sequences/index_reaction_numbers). Note: number 0 indicates the first sequence generated for this reaction, number 15.743307701421433 indicates perplexity and number 497 indicates the length (see the parsing sketch below).

ADDITIONAL INFORMATION ABOUT THE DATASET GENERATION:

Information on the datasets generated with the model for 4 different groups of reactions:

* First_generation_seqeuences_and_reactions

1) MOST REPEATED REACTIONS: 100 sequences are generated for each of the 20 most repeated reactions in the training set (2,000 sequences in total)
2) LESS REPEATED REACTIONS: 100 sequences are generated for each of the 98 less repeated reactions in the training set (9,800 sequences in total)
3) MIDDLE REPEATED REACTIONS: 100 sequences are generated for each of the 40 middle repeated reactions in the training set (4,000 sequences in total; the middle is calculated using the median)
4) HOLDOUT DATASET REACTIONS: 100 sequences are generated for each of the 40 reactions never seen in the training set (4,000 sequences in total)

All those generated sequences are then filtered by pLDDT value > 70 (calculated using ESMFold).
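The tag fields can be unpacked programmatically. A minimal sketch (not part of the original card; the function name and return layout are illustrative), assuming every generated-sequence tag follows the `<group>_<reaction_index>_reac_generated/<group>_<seq_number>_<perplexity>_<length>` pattern shown above:

```python
def parse_generated_tag(tag: str) -> dict:
    # e.g. tag = "holdout_9_reac_generated/holdout_0_15.743307701421433_497"
    folder, name = tag.split("/")
    # Reaction index into generated_sequences/index_reaction_numbers
    reaction_index = int(folder.split("_")[1])
    group, seq_number, perplexity, length = name.split("_")
    return {
        "group": group,                      # e.g. "holdout"
        "reaction_index": reaction_index,    # 9
        "sequence_number": int(seq_number),  # 0 = first sequence for this reaction
        "perplexity": float(perplexity),     # 15.743307701421433
        "length": int(length),               # 497
    }

print(parse_generated_tag("holdout_9_reac_generated/holdout_0_15.743307701421433_497"))
```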
nuriamimbreropelegri/generated_sequences
[ "region:us" ]
2024-02-12T13:37:36+00:00
{}
2024-02-15T21:31:30+00:00
[]
[]
TAGS #region-us
2 DATASETS: 1) Generated sequences: generated_sequences/generated_dataset_corrected_index.fasta 2) Natural sequences: generated_sequences/natural_dataset.fasta RELATE REACTIONS AND SEQUENCES: In the natural sequences dataset, sequences can be related to the reaction used to generate them. The sequence tag indicates the SMILES of the sequence. In the generated sequences dataset, sequences can be related to the reaction used to generate them: * Example of one sequence tag of the generated dataset: holdout_9_reac_generated/holdout_0_15.743307701421433_497. This belongs to the reaction with index number 9 (index numbers and reactions are stored in the folder generated_sequences/index_reaction_numbers). Note: number 0 indicates the first sequence generated for this reaction, number 15.743307701421433 indicates perplexity and number 497 indicates the length. ADDITIONAL INFORMATION ABOUT THE DATASET GENERATION: Information on the datasets generated with the model for 4 different groups of reactions: * First_generation_seqeuences_and_reactions 1) MOST REPEATED REACTIONS: 100 sequences are generated for each of the 20 most repeated reactions in the training set (2,000 sequences in total) 2) LESS REPEATED REACTIONS: 100 sequences are generated for each of the 98 less repeated reactions in the training set (9,800 sequences in total) 3) MIDDLE REPEATED REACTIONS: 100 sequences are generated for each of the 40 middle repeated reactions in the training set (4,000 sequences in total; the middle is calculated using the median) 4) HOLDOUT DATASET REACTIONS: 100 sequences are generated for each of the 40 reactions never seen in the training set (4,000 sequences in total) All those generated sequences are then filtered by pLDDT value > 70 (calculated using ESMFold).
[]
[ "TAGS\n#region-us \n" ]
8f5d420ed6e01c80768d506dd9ccd423c58170a5
# Dataset Card for Evaluation run of eren23/merged-dpo-binarized-NeutrixOmnibe-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [eren23/merged-dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/merged-dpo-binarized-NeutrixOmnibe-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T13:35:30.829046](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B/blob/main/results_2024-02-12T13-35-30.829046.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6522917946377467, "acc_stderr": 0.032023116879680597, "acc_norm": 0.6514554167254718, "acc_norm_stderr": 0.03269668311247275, "mc1": 0.6242350061199511, "mc1_stderr": 0.01695458406021429, "mc2": 0.7690258519527344, "mc2_stderr": 0.01393738583634334 }, "harness|arc:challenge|25": { "acc": 0.7158703071672355, "acc_stderr": 0.013179442447653884, "acc_norm": 0.726962457337884, "acc_norm_stderr": 0.013019332762635751 }, "harness|hellaswag|10": { "acc": 0.7152957578171679, "acc_stderr": 0.0045035118550500325, "acc_norm": 0.8902609042023502, "acc_norm_stderr": 0.003119254828848945 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.028254200344438662, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.028254200344438662 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7638888888888888, "acc_stderr": 0.03551446610810826, "acc_norm": 0.7638888888888888, "acc_norm_stderr": 0.03551446610810826 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 
0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, "acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.653179190751445, "acc_stderr": 0.036291466701596636, "acc_norm": 0.653179190751445, "acc_norm_stderr": 0.036291466701596636 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5448275862068965, "acc_stderr": 0.04149886942192117, "acc_norm": 0.5448275862068965, "acc_norm_stderr": 0.04149886942192117 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4126984126984127, "acc_stderr": 0.025355741263055273, "acc_norm": 0.4126984126984127, "acc_norm_stderr": 0.025355741263055273 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9119170984455959, "acc_stderr": 0.02045374660160103, "acc_norm": 0.9119170984455959, "acc_norm_stderr": 0.02045374660160103 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6615384615384615, "acc_stderr": 0.023991500500313036, "acc_norm": 0.6615384615384615, "acc_norm_stderr": 0.023991500500313036 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886797, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886797 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3708609271523179, 
"acc_stderr": 0.03943966699183629, "acc_norm": 0.3708609271523179, "acc_norm_stderr": 0.03943966699183629 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8250319284802043, "acc_stderr": 0.013586619219903348, "acc_norm": 0.8250319284802043, "acc_norm_stderr": 0.013586619219903348 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550829, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550829 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43910614525139663, "acc_stderr": 0.016598022120580428, "acc_norm": 0.43910614525139663, "acc_norm_stderr": 0.016598022120580428 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.02545775669666788, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.02545775669666788 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.02567025924218893, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.02567025924218893 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.02982074719142248, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.02982074719142248 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.470013037809648, "acc_stderr": 0.012747248967079067, "acc_norm": 0.470013037809648, "acc_norm_stderr": 0.012747248967079067 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.028263889943784593, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.028263889943784593 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6242350061199511, "mc1_stderr": 0.01695458406021429, "mc2": 0.7690258519527344, "mc2_stderr": 0.01393738583634334 }, "harness|winogrande|5": { "acc": 0.850828729281768, "acc_stderr": 0.010012598805627297 }, "harness|gsm8k|5": { "acc": 0.689158453373768, "acc_stderr": 0.012748860507777727 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B
[ "region:us" ]
2024-02-12T13:37:51+00:00
{"pretty_name": "Evaluation run of eren23/merged-dpo-binarized-NeutrixOmnibe-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [eren23/merged-dpo-binarized-NeutrixOmnibe-7B](https://huggingface.co/eren23/merged-dpo-binarized-NeutrixOmnibe-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T13:35:30.829046](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B/blob/main/results_2024-02-12T13-35-30.829046.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522917946377467,\n \"acc_stderr\": 0.032023116879680597,\n \"acc_norm\": 0.6514554167254718,\n \"acc_norm_stderr\": 0.03269668311247275,\n \"mc1\": 0.6242350061199511,\n \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7690258519527344,\n \"mc2_stderr\": 0.01393738583634334\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653884,\n \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635751\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7152957578171679,\n \"acc_stderr\": 0.0045035118550500325,\n \"acc_norm\": 0.8902609042023502,\n \"acc_norm_stderr\": 0.003119254828848945\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.013586619219903348,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.013586619219903348\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n \"acc_stderr\": 0.016598022120580428,\n \"acc_norm\": 0.43910614525139663,\n \"acc_norm_stderr\": 0.016598022120580428\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7690258519527344,\n \"mc2_stderr\": 0.01393738583634334\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627297\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.689158453373768,\n \"acc_stderr\": 0.012748860507777727\n 
}\n}\n```", "repo_url": "https://huggingface.co/eren23/merged-dpo-binarized-NeutrixOmnibe-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|arc:challenge|25_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|gsm8k|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hellaswag|10_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T13_35_30.829046", "path": ["**/details_harness|winogrande|5_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T13-35-30.829046.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_12T13_35_30.829046", "path": ["results_2024-02-12T13-35-30.829046.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T13-35-30.829046.parquet"]}]}]}
2024-02-12T13:38:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of eren23/merged-dpo-binarized-NeutrixOmnibe-7B

Dataset automatically created during the evaluation run of model eren23/merged-dpo-binarized-NeutrixOmnibe-7B on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch just after this card):

## Latest results

These are the latest results from run 2024-02-12T13:35:30.829046 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

## Dataset Details

### Dataset Description

- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:

### Dataset Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
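A minimal sketch of the loading snippet referenced above, following the same pattern as the other cards in this dump; the repository id `open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B` is an assumption derived from the leaderboard's `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming convention
# used by the Open LLM Leaderboard detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_eren23__merged-dpo-binarized-NeutrixOmnibe-7B",
    "harness_winogrande_5",
    split="train",
)
```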
[ "# Dataset Card for Evaluation run of eren23/merged-dpo-binarized-NeutrixOmnibe-7B\n\n\n\nDataset automatically created during the evaluation run of model eren23/merged-dpo-binarized-NeutrixOmnibe-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T13:35:30.829046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of eren23/merged-dpo-binarized-NeutrixOmnibe-7B\n\n\n\nDataset automatically created during the evaluation run of model eren23/merged-dpo-binarized-NeutrixOmnibe-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T13:35:30.829046(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
eaaf2d9f4549a6e3f21d86af7e5c656e9e35d59d
# Dataset Card for Evaluation run of AbacusResearch/haLLAwa2

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [AbacusResearch/haLLAwa2](https://huggingface.co/AbacusResearch/haLLAwa2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AbacusResearch__haLLAwa2",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-02-12T13:50:58.490257](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLAwa2/blob/main/results_2024-02-12T13-50-58.490257.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": { "acc": 0.6355767153439188, "acc_stderr": 0.032413752856157885, "acc_norm": 0.6387091168117495, "acc_norm_stderr": 0.03305418130027954, "mc1": 0.33047735618115054, "mc1_stderr": 0.016466769613698303, "mc2": 0.4737549402479496, "mc2_stderr": 0.015584581777910896 },
    "harness|arc:challenge|25": { "acc": 0.6015358361774744, "acc_stderr": 0.014306946052735565, "acc_norm": 0.6331058020477816, "acc_norm_stderr": 0.014084133118104298 },
    "harness|hellaswag|10": { "acc": 0.6836287592113125, "acc_stderr": 0.004641092001425291, "acc_norm": 0.8450507866958773, "acc_norm_stderr": 0.003611167302959773 },
    "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 },
    "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.04218506215368881, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.04218506215368881 },
    "harness|hendrycksTest-astronomy|5": { "acc": 0.6710526315789473, "acc_stderr": 0.038234289699266046, "acc_norm": 0.6710526315789473, "acc_norm_stderr": 0.038234289699266046 },
    "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 },
    "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.02845015479411864, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.02845015479411864 },
    "harness|hendrycksTest-college_biology|5": { "acc": 0.7152777777777778, "acc_stderr": 0.037738099906869334, "acc_norm": 0.7152777777777778, "acc_norm_stderr": 0.037738099906869334 },
    "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 },
    "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 },
    "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 },
    "harness|hendrycksTest-college_medicine|5": { "acc": 0.630057803468208, "acc_stderr": 0.0368122963339432, "acc_norm": 0.630057803468208, "acc_norm_stderr": 0.0368122963339432 },
    "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287533, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287533 },
    "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 },
    "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5787234042553191, "acc_stderr": 0.03227834510146268, "acc_norm": 0.5787234042553191, "acc_norm_stderr": 0.03227834510146268 },
    "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 },
    "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 },
    "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.025253032554997695, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.025253032554997695 },
    "harness|hendrycksTest-formal_logic|5": { "acc": 0.40476190476190477, "acc_stderr": 0.04390259265377562, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.04390259265377562 },
    "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 },
    "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7419354838709677, "acc_stderr": 0.024892469172462836, "acc_norm": 0.7419354838709677, "acc_norm_stderr": 0.024892469172462836 },
    "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.035145285621750094, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.035145285621750094 },
    "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.047258156262526066, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526066 },
    "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 },
    "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 },
    "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015184, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015184 },
    "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6205128205128205, "acc_stderr": 0.024603626924097417, "acc_norm": 0.6205128205128205, "acc_norm_stderr": 0.024603626924097417 },
    "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.362962962962963, "acc_stderr": 0.029318203645206865, "acc_norm": 0.362962962962963, "acc_norm_stderr": 0.029318203645206865 },
    "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6470588235294118, "acc_stderr": 0.031041941304059288, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.031041941304059288 },
    "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2913907284768212, "acc_stderr": 0.037101857261199946, "acc_norm": 0.2913907284768212, "acc_norm_stderr": 0.037101857261199946 },
    "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8311926605504587, "acc_stderr": 0.01606005626853035, "acc_norm": 0.8311926605504587, "acc_norm_stderr": 0.01606005626853035 },
    "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 0.03406315360711507, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.03406315360711507 },
    "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 },
    "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601453, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601453 },
    "harness|hendrycksTest-human_aging|5": { "acc": 0.6995515695067265, "acc_stderr": 0.030769352008229143, "acc_norm": 0.6995515695067265, "acc_norm_stderr": 0.030769352008229143 },
    "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728742, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728742 },
    "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 },
    "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037181, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037181 },
    "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 },
    "harness|hendrycksTest-machine_learning|5": { "acc": 0.48214285714285715, "acc_stderr": 0.047427623612430116, "acc_norm": 0.48214285714285715, "acc_norm_stderr": 0.047427623612430116 },
    "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.04058042015646034, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.04058042015646034 },
    "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 },
    "harness|hendrycksTest-medical_genetics|5": { "acc": 0.67, "acc_stderr": 0.047258156262526094, "acc_norm": 0.67, "acc_norm_stderr": 0.047258156262526094 },
    "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8135376756066411, "acc_stderr": 0.013927751372001512, "acc_norm": 0.8135376756066411, "acc_norm_stderr": 0.013927751372001512 },
    "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.02418242749657761, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.02418242749657761 },
    "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4223463687150838, "acc_stderr": 0.016519594275297114, "acc_norm": 0.4223463687150838, "acc_norm_stderr": 0.016519594275297114 },
    "harness|hendrycksTest-nutrition|5": { "acc": 0.7450980392156863, "acc_stderr": 0.02495418432487991, "acc_norm": 0.7450980392156863, "acc_norm_stderr": 0.02495418432487991 },
    "harness|hendrycksTest-philosophy|5": { "acc": 0.7363344051446945, "acc_stderr": 0.02502553850053234, "acc_norm": 0.7363344051446945, "acc_norm_stderr": 0.02502553850053234 },
    "harness|hendrycksTest-prehistory|5": { "acc": 0.7067901234567902, "acc_stderr": 0.02532988817190092, "acc_norm": 0.7067901234567902, "acc_norm_stderr": 0.02532988817190092 },
    "harness|hendrycksTest-professional_accounting|5": { "acc": 0.450354609929078, "acc_stderr": 0.029680105565029036, "acc_norm": 0.450354609929078, "acc_norm_stderr": 0.029680105565029036 },
    "harness|hendrycksTest-professional_law|5": { "acc": 0.4602346805736636, "acc_stderr": 0.01272978538659856, "acc_norm": 0.4602346805736636, "acc_norm_stderr": 0.01272978538659856 },
    "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6801470588235294, "acc_stderr": 0.02833295951403121, "acc_norm": 0.6801470588235294, "acc_norm_stderr": 0.02833295951403121 },
    "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6470588235294118, "acc_stderr": 0.019333142020797157, "acc_norm": 0.6470588235294118, "acc_norm_stderr": 0.019333142020797157 },
    "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 },
    "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 },
    "harness|hendrycksTest-sociology|5": { "acc": 0.845771144278607, "acc_stderr": 0.025538433368578337, "acc_norm": 0.845771144278607, "acc_norm_stderr": 0.025538433368578337 },
    "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 },
    "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.03878626771002361, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.03878626771002361 },
    "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 },
    "harness|truthfulqa:mc|0": { "mc1": 0.33047735618115054, "mc1_stderr": 0.016466769613698303, "mc2": 0.4737549402479496, "mc2_stderr": 0.015584581777910896 },
    "harness|winogrande|5": { "acc": 0.7584846093133386, "acc_stderr": 0.012028983782011875 },
    "harness|gsm8k|5": { "acc": 0.5208491281273692, "acc_stderr": 0.013760506094029868 }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
open-llm-leaderboard/details_AbacusResearch__haLLAwa2
[ "region:us" ]
2024-02-12T13:41:43+00:00
{"pretty_name": "Evaluation run of AbacusResearch/haLLAwa2", "dataset_summary": "Dataset automatically created during the evaluation run of model [AbacusResearch/haLLAwa2](https://huggingface.co/AbacusResearch/haLLAwa2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AbacusResearch__haLLAwa2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T13:50:58.490257](https://huggingface.co/datasets/open-llm-leaderboard/details_AbacusResearch__haLLAwa2/blob/main/results_2024-02-12T13-50-58.490257.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6355767153439188,\n \"acc_stderr\": 0.032413752856157885,\n \"acc_norm\": 0.6387091168117495,\n \"acc_norm_stderr\": 0.03305418130027954,\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.016466769613698303,\n \"mc2\": 0.4737549402479496,\n \"mc2_stderr\": 0.015584581777910896\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735565,\n \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6836287592113125,\n \"acc_stderr\": 0.004641092001425291,\n \"acc_norm\": 0.8450507866958773,\n \"acc_norm_stderr\": 0.003611167302959773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368881,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368881\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462836,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462836\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8311926605504587,\n \"acc_stderr\": 0.01606005626853035,\n \"acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.01606005626853035\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601453,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601453\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001512,\n 
\"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001512\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n \"acc_stderr\": 0.016519594275297114,\n \"acc_norm\": 0.4223463687150838,\n \"acc_norm_stderr\": 0.016519594275297114\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.01272978538659856,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.01272978538659856\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.019333142020797157,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.019333142020797157\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.03878626771002361,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.03878626771002361\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n \"mc1_stderr\": 0.016466769613698303,\n \"mc2\": 0.4737549402479496,\n \"mc2_stderr\": 0.015584581777910896\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011875\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5208491281273692,\n \"acc_stderr\": 0.013760506094029868\n }\n}\n```", "repo_url": 
"https://huggingface.co/AbacusResearch/haLLAwa2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|arc:challenge|25_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|arc:challenge|25_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|gsm8k|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|gsm8k|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hellaswag|10_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hellaswag|10_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-39-22.814188.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T13-39-22.814188.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-39-22.814188.parquet"]}, 
{"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["**/details_harness|winogrande|5_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": ["**/details_harness|winogrande|5_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T13-50-58.490257.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T13_39_22.814188", "path": ["results_2024-02-12T13-39-22.814188.parquet"]}, {"split": "2024_02_12T13_50_58.490257", "path": 
["results_2024-02-12T13-50-58.490257.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T13-50-58.490257.parquet"]}]}]}
2024-02-12T13:53:19+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of AbacusResearch/haLLAwa2 Dataset automatically created during the evaluation run of model AbacusResearch/haLLAwa2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following, as sketched below: ## Latest results These are the latest results from run 2024-02-12T13:50:58.490257 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
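A minimal loading sketch, assuming the Hugging Face `datasets` library; the repository id `open-llm-leaderboard/details_AbacusResearch__haLLAwa2` is inferred from the leaderboard's usual naming convention (it is not stated explicitly in this record), and `harness_winogrande_5` is one of the config names listed in the metadata above:

```python
from datasets import load_dataset

# Repository id inferred from the Open LLM Leaderboard naming convention;
# "harness_winogrande_5" is one of the 63 configs listed in this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_AbacusResearch__haLLAwa2",
    "harness_winogrande_5",
    split="train",
)
```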
[ "# Dataset Card for Evaluation run of AbacusResearch/haLLAwa2\n\n\n\nDataset automatically created during the evaluation run of model AbacusResearch/haLLAwa2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T13:50:58.490257(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of AbacusResearch/haLLAwa2\n\n\n\nDataset automatically created during the evaluation run of model AbacusResearch/haLLAwa2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T13:50:58.490257(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
fea176fcd256f0a7921992c584a8a971addacf57
This dataset was imported from OpenSLR for the Javanese language. It can be used for several tasks such as Automatic Speech Recognition and Text-to-Speech. It contains 5,822 examples, each consisting of a speech vector (speech), a transcription (transcription), and a unique identifier (id). The dataset copyright is held by the dataset creators (https://openslr.org/41/).
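A minimal loading sketch, assuming the Hugging Face `datasets` library; the column names (`speech`, `sentence`, `id`) and the single `train` split are taken from the dataset metadata later in this record:

```python
from datasets import load_dataset

# Single "train" split with 5,822 examples (per the dataset metadata).
ds = load_dataset("avalonai/openslr_javanese_5k", split="train")

example = ds[0]
print(example["id"])           # int64 unique identifier
print(example["sentence"])     # string transcription
print(len(example["speech"]))  # raw audio as a sequence of float32 samples
```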
avalonai/openslr_javanese_5k
[ "task_categories:automatic-speech-recognition", "size_categories:1K<n<10K", "language:jv", "license:mit", "region:us" ]
2024-02-12T14:03:43+00:00
{"language": ["jv"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["automatic-speech-recognition"], "dataset_info": {"features": [{"name": "speech", "sequence": "float32"}, {"name": "sentence", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1611089713, "num_examples": 5822}], "download_size": 1615105742, "dataset_size": 1611089713}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T14:17:51+00:00
[]
[ "jv" ]
TAGS #task_categories-automatic-speech-recognition #size_categories-1K<n<10K #language-Javanese #license-mit #region-us
This dataset was imported from OpenSLR for the Javanese language. It can be used for several tasks such as Automatic Speech Recognition and Text-to-Speech. It contains 5,822 examples, each consisting of a speech vector (speech), a transcription (transcription), and a unique identifier (id). The dataset copyright is held by the dataset creators (URL
[]
[ "TAGS\n#task_categories-automatic-speech-recognition #size_categories-1K<n<10K #language-Javanese #license-mit #region-us \n" ]
c96427a8929dedc6dec2f5fb1424be8cec5356bb
Altered version of: https://huggingface.co/datasets/lukecarlate/general_financial_news

Entries containing `None` data types have been removed.

---
license: cc0-1.0
task_categories:
- text-classification
language:
- en
tags:
- finance
pretty_name: General Financial News Altared
size_categories:
- 10K<n<100K
---
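A hypothetical sketch of how such a cleaning step could be reproduced with the `datasets` library; the upstream repository id comes from the link above, and the `train` split name is an assumption:

```python
from datasets import load_dataset

# Load the upstream dataset (the "train" split name is an assumption).
ds = load_dataset("lukecarlate/general_financial_news", split="train")

# Drop every row that contains a None value in any column.
clean = ds.filter(lambda row: all(v is not None for v in row.values()))
```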
KennNguyenDev/General_Financial_News_Altared
[ "region:us" ]
2024-02-12T14:07:01+00:00
{}
2024-02-12T14:08:35+00:00
[]
[]
TAGS #region-us
Altered version of: URL. Entries containing `None` data types have been removed. --- license: cc0-1.0 task_categories: - text-classification language: - en tags: - finance pretty_name: General Financial News Altared size_categories: - 10K<n<100K ---
[]
[ "TAGS\n#region-us \n" ]
fd01dbf0fa09c42ca8349ac17bf6518719d6320b
Dataset for "Fact-Aware Fake-news Classification for Indonesian Language"</br></br> Data originates from https://saberhoaks.jabarprov.go.id/v2/ ; https://opendata.jabarprov.go.id/id/dataset/ ; https://klinikhoaks.jatimprov.go.id/ </br> The attributes of data are: </br> 1. Label_id: Binary class labels ("HOAX"==1 ; "NON-HOAX"==0).</br> 2. Label: Binary class labels ("HOAX" or "NON-HOAX").</br> 3. Title: Claim or headline of news article.</br> 4. Content: the content of news article. </br> 5. Fact: The summary of factual evidence that is either supporting or contradicting the correponding claim.</br> 6. References: URL link of news article and the corresponding verdict or factual evidence as the justification of the news article.</br> 7. Classification: Fine-grained classification labels for the news article:</br> Class labels for saberhoax_data.csv: 'DISINFORMASI', ,'MISINFORMASI', 'FABRICATED CONTENT', 'FALSE CONNECTION', 'FALSE CONTEXT', 'IMPOSTER CONTENT', </br> 'MANIPULATED CONTENT', 'MISLEADING CONTENT', 'SATIRE OR PARODI', 'BENAR'.</br> Class labels for opendata_jabar.csv: 'BENAR', 'DISINFORMASI (HOAKS)', 'FABRICATED CONTENT', 'FALSE CONNECTION', 'FALSE CONTEXT', 'IMPOSTER CONTENT',</br> 'MANIPULATED CONTENT', 'MISINFORMASI (HOAKS)', 'MISLEADING CONTENT' </br> </br> Example of usage:</br> ```python >>> from datasets import load_dataset >>> train_dataset = load_dataset( ... "nlp-brin-id/id-hoax-report", ... split="train", ... keep_default_na=False, ... ).select_columns(['Label_id', 'Title', 'Content', 'Fact']) ```
nlp-brin-id/id-hoax-report
[ "task_categories:text-classification", "size_categories:1K<n<10K", "language:id", "license:apache-2.0", "region:us" ]
2024-02-12T14:22:13+00:00
{"language": ["id"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"]}
2024-02-15T06:47:37+00:00
[]
[ "id" ]
TAGS #task_categories-text-classification #size_categories-1K<n<10K #language-Indonesian #license-apache-2.0 #region-us
Dataset for "Fact-Aware Fake-news Classification for Indonesian Language"</br></br> Data originates from URL ; URL ; URL </br> The attributes of data are: </br> 1. Label_id: Binary class labels ("HOAX"==1 ; "NON-HOAX"==0).</br> 2. Label: Binary class labels ("HOAX" or "NON-HOAX").</br> 3. Title: Claim or headline of news article.</br> 4. Content: the content of news article. </br> 5. Fact: The summary of factual evidence that is either supporting or contradicting the correponding claim.</br> 6. References: URL link of news article and the corresponding verdict or factual evidence as the justification of the news article.</br> 7. Classification: Fine-grained classification labels for the news article:</br> Class labels for saberhoax_data.csv: 'DISINFORMASI', ,'MISINFORMASI', 'FABRICATED CONTENT', 'FALSE CONNECTION', 'FALSE CONTEXT', 'IMPOSTER CONTENT', </br> 'MANIPULATED CONTENT', 'MISLEADING CONTENT', 'SATIRE OR PARODI', 'BENAR'.</br> Class labels for opendata_jabar.csv: 'BENAR', 'DISINFORMASI (HOAKS)', 'FABRICATED CONTENT', 'FALSE CONNECTION', 'FALSE CONTEXT', 'IMPOSTER CONTENT',</br> 'MANIPULATED CONTENT', 'MISINFORMASI (HOAKS)', 'MISLEADING CONTENT' </br> </br> Example of usage:</br>
[]
[ "TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-Indonesian #license-apache-2.0 #region-us \n" ]
5bc678a9ac09028dcfe8b58441b5ac5c366d8eea
# Dataset Card for Evaluation run of Eurdem/Megatron-Mx <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Eurdem/Megatron-Mx](https://huggingface.co/Eurdem/Megatron-Mx) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Eurdem__Megatron-Mx", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T14:27:26.866749](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__Megatron-Mx/blob/main/results_2024-02-12T14-27-26.866749.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6226080527190941, "acc_stderr": 0.03275260029756631, "acc_norm": 0.625474906418769, "acc_norm_stderr": 0.033408847072192514, "mc1": 0.423500611995104, "mc1_stderr": 0.01729742144853473, "mc2": 0.5995420715043286, "mc2_stderr": 0.015268251321394645 }, "harness|arc:challenge|25": { "acc": 0.6245733788395904, "acc_stderr": 0.014150631435111725, "acc_norm": 0.6689419795221843, "acc_norm_stderr": 0.013752062419817832 }, "harness|hellaswag|10": { "acc": 0.6502688707428799, "acc_stderr": 0.004759103432380764, "acc_norm": 0.8498307110137423, "acc_norm_stderr": 0.0035650718701954473 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252606, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252606 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5925925925925926, "acc_stderr": 0.04244633238353228, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.04244633238353228 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7171052631578947, "acc_stderr": 0.03665349695640767, "acc_norm": 0.7171052631578947, "acc_norm_stderr": 0.03665349695640767 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.02815283794249387, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.02815283794249387 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7222222222222222, "acc_stderr": 0.03745554791462456, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.03745554791462456 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.43, "acc_stderr": 0.04975698519562428, "acc_norm": 0.43, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.05016135580465919, "acc_norm": 0.53, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr":
0.046056618647183814,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.046056618647183814
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.6069364161849711,
        "acc_stderr": 0.03724249595817731,
        "acc_norm": 0.6069364161849711,
        "acc_norm_stderr": 0.03724249595817731
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.3627450980392157,
        "acc_stderr": 0.04784060704105653,
        "acc_norm": 0.3627450980392157,
        "acc_norm_stderr": 0.04784060704105653
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.76,
        "acc_stderr": 0.04292346959909283,
        "acc_norm": 0.76,
        "acc_norm_stderr": 0.04292346959909283
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.5361702127659574,
        "acc_stderr": 0.032600385118357715,
        "acc_norm": 0.5361702127659574,
        "acc_norm_stderr": 0.032600385118357715
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.4298245614035088,
        "acc_stderr": 0.04657047260594963,
        "acc_norm": 0.4298245614035088,
        "acc_norm_stderr": 0.04657047260594963
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.5793103448275863,
        "acc_stderr": 0.0411391498118926,
        "acc_norm": 0.5793103448275863,
        "acc_norm_stderr": 0.0411391498118926
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.4312169312169312,
        "acc_stderr": 0.025506481698138208,
        "acc_norm": 0.4312169312169312,
        "acc_norm_stderr": 0.025506481698138208
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.4523809523809524,
        "acc_stderr": 0.044518079590553275,
        "acc_norm": 0.4523809523809524,
        "acc_norm_stderr": 0.044518079590553275
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.36,
        "acc_stderr": 0.04824181513244218,
        "acc_norm": 0.36,
        "acc_norm_stderr": 0.04824181513244218
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.5774193548387097,
        "acc_stderr": 0.02810096472427264,
        "acc_norm": 0.5774193548387097,
        "acc_norm_stderr": 0.02810096472427264
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.458128078817734,
        "acc_stderr": 0.03505630140785741,
        "acc_norm": 0.458128078817734,
        "acc_norm_stderr": 0.03505630140785741
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.65,
        "acc_stderr": 0.04793724854411018,
        "acc_norm": 0.65,
        "acc_norm_stderr": 0.04793724854411018
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7575757575757576,
        "acc_stderr": 0.03346409881055953,
        "acc_norm": 0.7575757575757576,
        "acc_norm_stderr": 0.03346409881055953
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.7727272727272727,
        "acc_stderr": 0.029857515673386424,
        "acc_norm": 0.7727272727272727,
        "acc_norm_stderr": 0.029857515673386424
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.8341968911917098,
        "acc_stderr": 0.026839845022314415,
        "acc_norm": 0.8341968911917098,
        "acc_norm_stderr": 0.026839845022314415
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.5974358974358974,
        "acc_stderr": 0.02486499515976775,
        "acc_norm": 0.5974358974358974,
        "acc_norm_stderr": 0.02486499515976775
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.3148148148148148,
        "acc_stderr": 0.02831753349606648,
        "acc_norm": 0.3148148148148148,
        "acc_norm_stderr": 0.02831753349606648
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.6554621848739496,
        "acc_stderr": 0.03086868260412162,
        "acc_norm": 0.6554621848739496,
        "acc_norm_stderr": 0.03086868260412162
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.33112582781456956,
        "acc_stderr": 0.038425817186598696,
        "acc_norm": 0.33112582781456956,
        "acc_norm_stderr": 0.038425817186598696
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.8128440366972477,
        "acc_stderr": 0.016722684526200148,
        "acc_norm": 0.8128440366972477,
        "acc_norm_stderr": 0.016722684526200148
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.4722222222222222,
        "acc_stderr": 0.0340470532865388,
        "acc_norm": 0.4722222222222222,
        "acc_norm_stderr": 0.0340470532865388
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.7843137254901961,
        "acc_stderr": 0.028867431449849303,
        "acc_norm": 0.7843137254901961,
        "acc_norm_stderr": 0.028867431449849303
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.8059071729957806,
        "acc_stderr": 0.025744902532290916,
        "acc_norm": 0.8059071729957806,
        "acc_norm_stderr": 0.025744902532290916
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.6681614349775785,
        "acc_stderr": 0.03160295143776679,
        "acc_norm": 0.6681614349775785,
        "acc_norm_stderr": 0.03160295143776679
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.7862595419847328,
        "acc_stderr": 0.0359546161177469,
        "acc_norm": 0.7862595419847328,
        "acc_norm_stderr": 0.0359546161177469
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.8181818181818182,
        "acc_stderr": 0.03520893951097652,
        "acc_norm": 0.8181818181818182,
        "acc_norm_stderr": 0.03520893951097652
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.7685185185185185,
        "acc_stderr": 0.04077494709252627,
        "acc_norm": 0.7685185185185185,
        "acc_norm_stderr": 0.04077494709252627
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.7361963190184049,
        "acc_stderr": 0.034624199316156234,
        "acc_norm": 0.7361963190184049,
        "acc_norm_stderr": 0.034624199316156234
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.5089285714285714,
        "acc_stderr": 0.04745033255489123,
        "acc_norm": 0.5089285714285714,
        "acc_norm_stderr": 0.04745033255489123
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.7572815533980582,
        "acc_stderr": 0.04245022486384495,
        "acc_norm": 0.7572815533980582,
        "acc_norm_stderr": 0.04245022486384495
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8632478632478633,
        "acc_stderr": 0.022509033937077802,
        "acc_norm": 0.8632478632478633,
        "acc_norm_stderr": 0.022509033937077802
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.72,
        "acc_stderr": 0.04512608598542128,
        "acc_norm": 0.72,
        "acc_norm_stderr": 0.04512608598542128
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.8148148148148148,
        "acc_stderr": 0.013890862162876166,
        "acc_norm": 0.8148148148148148,
        "acc_norm_stderr": 0.013890862162876166
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.6936416184971098,
        "acc_stderr": 0.024818350129436593,
        "acc_norm": 0.6936416184971098,
        "acc_norm_stderr": 0.024818350129436593
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.4,
        "acc_stderr": 0.01638463841038082,
        "acc_norm": 0.4,
        "acc_norm_stderr": 0.01638463841038082
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.7091503267973857,
        "acc_stderr": 0.02600480036395213,
        "acc_norm": 0.7091503267973857,
        "acc_norm_stderr": 0.02600480036395213
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.707395498392283,
        "acc_stderr": 0.02583989833487798,
        "acc_norm": 0.707395498392283,
        "acc_norm_stderr": 0.02583989833487798
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.7160493827160493,
        "acc_stderr": 0.025089478523765137,
        "acc_norm": 0.7160493827160493,
        "acc_norm_stderr": 0.025089478523765137
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.475177304964539,
        "acc_stderr": 0.029790719243829727,
        "acc_norm": 0.475177304964539,
        "acc_norm_stderr": 0.029790719243829727
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.4576271186440678,
        "acc_stderr": 0.012724296550980188,
        "acc_norm": 0.4576271186440678,
        "acc_norm_stderr": 0.012724296550980188
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.6470588235294118,
        "acc_stderr": 0.0290294228156814,
        "acc_norm": 0.6470588235294118,
        "acc_norm_stderr": 0.0290294228156814
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.6601307189542484,
        "acc_stderr": 0.019162418588623553,
        "acc_norm": 0.6601307189542484,
        "acc_norm_stderr": 0.019162418588623553
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6818181818181818,
        "acc_stderr": 0.044612721759105085,
        "acc_norm": 0.6818181818181818,
        "acc_norm_stderr": 0.044612721759105085
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.7346938775510204,
        "acc_stderr": 0.02826388994378459,
        "acc_norm": 0.7346938775510204,
        "acc_norm_stderr": 0.02826388994378459
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.6169154228855721,
        "acc_stderr": 0.03437519337338251,
        "acc_norm": 0.6169154228855721,
        "acc_norm_stderr": 0.03437519337338251
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.83,
        "acc_stderr": 0.03775251680686371,
        "acc_norm": 0.83,
        "acc_norm_stderr": 0.03775251680686371
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.5060240963855421,
        "acc_stderr": 0.03892212195333045,
        "acc_norm": 0.5060240963855421,
        "acc_norm_stderr": 0.03892212195333045
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.8304093567251462,
        "acc_stderr": 0.02878210810540171,
        "acc_norm": 0.8304093567251462,
        "acc_norm_stderr": 0.02878210810540171
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.423500611995104,
        "mc1_stderr": 0.01729742144853473,
        "mc2": 0.5995420715043286,
        "mc2_stderr": 0.015268251321394645
    },
    "harness|winogrande|5": {
        "acc": 0.7900552486187845,
        "acc_stderr": 0.01144628062926263
    },
    "harness|gsm8k|5": {
        "acc": 0.5299469294920395,
        "acc_stderr": 0.013747759685444704
    }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
open-llm-leaderboard/details_Eurdem__Megatron-Mx
[ "region:us" ]
2024-02-12T14:29:44+00:00
{"pretty_name": "Evaluation run of Eurdem/Megatron-Mx", "dataset_summary": "Dataset automatically created during the evaluation run of model [Eurdem/Megatron-Mx](https://huggingface.co/Eurdem/Megatron-Mx) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eurdem__Megatron-Mx\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T14:27:26.866749](https://huggingface.co/datasets/open-llm-leaderboard/details_Eurdem__Megatron-Mx/blob/main/results_2024-02-12T14-27-26.866749.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6226080527190941,\n \"acc_stderr\": 0.03275260029756631,\n \"acc_norm\": 0.625474906418769,\n \"acc_norm_stderr\": 0.033408847072192514,\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.01729742144853473,\n \"mc2\": 0.5995420715043286,\n \"mc2_stderr\": 0.015268251321394645\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111725,\n \"acc_norm\": 0.6689419795221843,\n \"acc_norm_stderr\": 0.013752062419817832\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6502688707428799,\n \"acc_stderr\": 0.004759103432380764,\n \"acc_norm\": 0.8498307110137423,\n \"acc_norm_stderr\": 0.0035650718701954473\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138208,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5774193548387097,\n \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.5774193548387097,\n \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411018,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411018\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386424,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386424\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 0.02486499515976775,\n \"acc_norm\": 
0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.03086868260412162,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.03086868260412162\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8128440366972477,\n \"acc_stderr\": 0.016722684526200148,\n \"acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.016722684526200148\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849303,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849303\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156234,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156234\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876166,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.013890862162876166\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765137,\n \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829727,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829727\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.02826388994378459,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.02826388994378459\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n \"acc_stderr\": 0.03437519337338251,\n \"acc_norm\": 0.6169154228855721,\n \"acc_norm_stderr\": 0.03437519337338251\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.423500611995104,\n \"mc1_stderr\": 0.01729742144853473,\n \"mc2\": 0.5995420715043286,\n \"mc2_stderr\": 0.015268251321394645\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5299469294920395,\n \"acc_stderr\": 0.013747759685444704\n }\n}\n```", "repo_url": "https://huggingface.co/Eurdem/Megatron-Mx", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email 
protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|arc:challenge|25_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|gsm8k|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hellaswag|10_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T14-27-26.866749.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T14-27-26.866749.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T14-27-26.866749.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T14-27-26.866749.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T14-27-26.866749.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["**/details_harness|winogrande|5_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T14-27-26.866749.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T14_27_26.866749", "path": ["results_2024-02-12T14-27-26.866749.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T14-27-26.866749.parquet"]}]}]}
2024-02-12T14:30:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Eurdem/Megatron-Mx Dataset automatically created during the evaluation run of model Eurdem/Megatron-Mx on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T14:27:26.866749 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of Eurdem/Megatron-Mx\n\n\n\nDataset automatically created during the evaluation run of model Eurdem/Megatron-Mx on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T14:27:26.866749(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Eurdem/Megatron-Mx\n\n\n\nDataset automatically created during the evaluation run of model Eurdem/Megatron-Mx on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T14:27:26.866749(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
06fbff4e60bbdd85686065187f109276d868cac3
Just the test split of [minipile](https://huggingface.co/datasets/JeanKaddour/minipile) in .jsonl.zst format for loading into preexisting pipelines (i.e., lm-eval)
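For use outside lm-eval, here is a minimal sketch for reading a `.jsonl.zst` file with the `zstandard` package; the filename below is a placeholder, so check the repository's file listing for the actual name.

```python
import io
import json

import zstandard as zstd  # pip install zstandard

path = "test.jsonl.zst"  # placeholder filename, not confirmed against the repo

with open(path, "rb") as fh:
    # Stream-decompress the zstd container, then decode line-delimited JSON.
    reader = zstd.ZstdDecompressor().stream_reader(fh)
    text_stream = io.TextIOWrapper(reader, encoding="utf-8")
    records = [json.loads(line) for line in text_stream]

print(len(records), "documents")
```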
monology/minipile-test
[ "region:us" ]
2024-02-12T14:36:49+00:00
{}
2024-02-12T14:44:57+00:00
[]
[]
TAGS #region-us
Just the test split of minipile in .URL format for loading into preexisting pipelines (i.e., lm-eval)
[]
[ "TAGS\n#region-us \n" ]
bff071326c3a595cee2a592b970c12a3adbb07b2
# Dataset Card for "arabic_punctuation" ## Dataset Details ### Dataset Description This is a curated dataset, specifically designed to facilitate the study of punctuation. It has undergone rigorous manual annotation and verification on the basis of sentence structure, with sentence boundaries clearly marked. The dataset is in three folders: 1. The ABC component of the Arabic Punctuation Dataset: This folder features the manually annotated punctuation gold standard. It consists of one chapter extracted from each of 45 non-fiction books by 36 authors from 19 different fields of study. It contains 45 text files with a total of 149K tokens in 13K sentences. 2. The CBT component: This folder has 1085 text files in 60 sub-folders, the full text of complete book translations that had been rendered from English into Arabic independently of this project. Their punctuation, we found out, mirrors the English source language texts; i.e., the sentence terminals in these Arabic texts follow the rules of English. In this folder are close to 3M words in more than 170K properly punctuated sentences. 3. The SSAC-UNPC component: This folder constitutes the third part of the Arabic Punctuation Dataset. It has close to 12M disconnected, disordered, complete sentences in 79 text files. These scrambled sentences were extracted from the predominantly legal Arabic subcorpus of the United Nations Parallel Corpus (UNPC). The punctuation here is authentic. It was done by the UN translators as part of their work. We consider this to be an excellent punctuation corpus because it mirrors the rule-governed punctuation of the English source documents, especially in relation to sentence terminals. These scrambled sentences total more than 309M words. ### Steps to reproduce The ABC component was manually annotated and verified. The CBT dataset was translated books extracted from an online library. The SSAC-UNPC dataset was full sentences extracted from the Arabic component of the United Nations Parallel Corpus. ## Citation ``` @misc{Yagi_Ashraf Elnagar_2024, url={https://data.mendeley.com/datasets/2pkxckwgs3/1}, journal={Arabic Punctuation Dataset}, publisher={Mendeley Data}, author={Yagi, Sane and Ashraf Elnagar}, year={2024}, month={Jan}} ```
asas-ai/arabic_punctuation
[ "license:cc-by-4.0", "region:us" ]
2024-02-12T14:41:29+00:00
{"license": "cc-by-4.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "dataset_name", "dtype": "string"}, {"name": "subset_name", "dtype": "string"}, {"name": "text_no_punc", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7357785049, "num_examples": 11738819}], "download_size": 3092363938, "dataset_size": 7357785049}}
2024-02-12T15:26:28+00:00
[]
[]
TAGS #license-cc-by-4.0 #region-us
# Dataset Card for "arabic_punctuation" ## Dataset Details ### Dataset Description This is a curated dataset, specifically designed to facilitate the study of punctuation. It has undergone rigorous manual annotation and verification on the basis of sentence structure, with sentence boundaries clearly marked. The dataset is in three folders: 1. The ABC component of the Arabic Punctuation Dataset: This folder features the manually annotated punctuation gold standard. It consists of one chapter extracted from each of 45 non-fiction books by 36 authors from 19 different fields of study. It contains 45 text files with a total of 149K tokens in 13K sentences. 2. The CBT component: This folder has 1085 text files in 60 sub-folders, the full text of complete book translations that had been rendered from English into Arabic independently of this project. Their punctuation, we found out, mirrors the English source language texts; i.e., the sentence terminals in these Arabic texts follow the rules of English. In this folder are close to 3M words in more than 170K properly punctuated sentences. 3. The SSAC-UNPC component: This folder constitutes the third part of the Arabic Punctuation Dataset. It has close to 12M disconnected, disordered, complete sentences in 79 text files. These scrambled sentences were extracted from the predominantly legal Arabic subcorpus of the United Nations Parallel Corpus (UNPC). The punctuation here is authentic. It was done by the UN translators as part of their work. We consider this to be an excellent punctuation corpus because it mirrors the rule-governed punctuation of the English source documents, especially in relation to sentence terminals. These scrambled sentences total more than 309M words. ### Steps to reproduce The ABC component was manually annotated and verified. The CBT dataset was translated books extracted from an online library. The SSAC-UNPC dataset was full sentences extracted from the Arabic component of the United Nations Parallel Corpus.
[ "# Dataset Card for \"arabic_punctuation\"", "## Dataset Details", "### Dataset Description\n\n\nThis is a curated dataset, specifically designed to facilitate the study of punctuation. It has undergone rigorous manual annotation and verification on the basis of sentence structure, with sentence boundaries clearly marked. The dataset is in three folders:\n\n1. The ABC component of the Arabic Punctuation Dataset: This folder features the manually annotated punctuation gold standard. It consists of one chapter extracted from each of 45 non-fiction books by 36 authors from 19 different fields of study. It contains 45 text files with a total of 149K tokens in 13K sentences. \n\n2. The CBT component: This folder has 1085 text files in 60 sub-folders, the full text of complete book translations that had been rendered from English into Arabic independently of this project. Their punctuation, we found out, mirrors the English source language texts; i.e., the sentence terminals in these Arabic texts follow the rules of English. In this folder are close to 3M words in more than 170K properly punctuated sentences.\n\n3. The SSAC-UNPC component: This folder constitutes the third part of the Arabic Punctuation Dataset. It has close to 12M disconnected, disordered, complete sentences in 79 text files. These scrambled sentences were extracted from the predominantly legal Arabic subcorpus of the United Nations Parallel Corpus (UNPC). The punctuation here is authentic. It was done by the UN translators as part of their work. We consider this to be an excellent punctuation corpus because it mirrors the rule-governed punctuation of the English source documents, especially in relation to sentence terminals. These scrambled sentences total more than 309M words.", "### Steps to reproduce\n\nThe ABC component was manually annotated and verified.\nThe CBT dataset was translated books extracted from an online library.\nThe SSAC-UNPC dataset was full sentences extracted from the Arabic component of the United Nations Parallel Corpus." ]
[ "TAGS\n#license-cc-by-4.0 #region-us \n", "# Dataset Card for \"arabic_punctuation\"", "## Dataset Details", "### Dataset Description\n\n\nThis is a curated dataset, specifically designed to facilitate the study of punctuation. It has undergone rigorous manual annotation and verification on the basis of sentence structure, with sentence boundaries clearly marked. The dataset is in three folders:\n\n1. The ABC component of the Arabic Punctuation Dataset: This folder features the manually annotated punctuation gold standard. It consists of one chapter extracted from each of 45 non-fiction books by 36 authors from 19 different fields of study. It contains 45 text files with a total of 149K tokens in 13K sentences. \n\n2. The CBT component: This folder has 1085 text files in 60 sub-folders, the full text of complete book translations that had been rendered from English into Arabic independently of this project. Their punctuation, we found out, mirrors the English source language texts; i.e., the sentence terminals in these Arabic texts follow the rules of English. In this folder are close to 3M words in more than 170K properly punctuated sentences.\n\n3. The SSAC-UNPC component: This folder constitutes the third part of the Arabic Punctuation Dataset. It has close to 12M disconnected, disordered, complete sentences in 79 text files. These scrambled sentences were extracted from the predominantly legal Arabic subcorpus of the United Nations Parallel Corpus (UNPC). The punctuation here is authentic. It was done by the UN translators as part of their work. We consider this to be an excellent punctuation corpus because it mirrors the rule-governed punctuation of the English source documents, especially in relation to sentence terminals. These scrambled sentences total more than 309M words.", "### Steps to reproduce\n\nThe ABC component was manually annotated and verified.\nThe CBT dataset was translated books extracted from an online library.\nThe SSAC-UNPC dataset was full sentences extracted from the Arabic component of the United Nations Parallel Corpus." ]
bdafcfc51eaae88bbf919e5314410124688f155c
# Dataset Card for "Semantic Relations Extraction" ## Dataset Description ### Repository The "Semantic Relations Extraction" dataset is hosted on the Hugging Face platform, and was created with code from this [GitHub repository](https://github.com/DehydratedWater/qlora_semantic_extraction). ### Purpose The "Semantic Relations Extraction" dataset was created for the purpose of fine-tuning smaller LLama2 (7B) models to speed up and reduce the costs of extracting semantic relations between entities in texts. This repository is part of a larger project aimed at creating a low-cost system for preprocessing documents in order to build a knowledge graph used for question answering and automated alerting. ### Data Sources The dataset was built using the following source: - [`datasets/scientific_papers`](https://huggingface.co/datasets/scientific_papers) ### Files in the Dataset The repository contains 4 files: 1. `extracted_relations.csv` -> A dataset of generated relations between entities containing columns for [`summary`, `article part`, `output json`, `database`, `abstract`, `list_of_contents`]. 2. `core_extracted_relations.csv` -> The same dataset but without the original abstracts and lists_of_contents. It contains columns for [`summary`, `article part`, `output json`]. 3. `llama2_prompts.csv` -> Multiple variants of the prompt with a response that can be used for fine-tuning the model. It is created by concatenating data in `core_extracted_relations.csv`. 4. `synthetic_data_12_02_24-full.dump` -> A backup of the whole PostgreSQL database used during data generation. It is the source for all the other files, exported by the `airflow` user in custom format, with compression level 6 and UTF-8 encoding. ### Database Schema The dataset includes a database schema illustration, which provides an overview of how the data is organized within the database. ![Database Schema](https://huggingface.co/datasets/DehydratedWater42/semantic_relations_extraction/resolve/main/database_diagram.png) ### GitHub Repository Synthetic data was generated using an Airflow data pipeline. The entire codebase can be accessed in this [GitHub repository](https://github.com/DehydratedWater/qlora_semantic_extraction). ### Generation Process This data was generated based on the `datasets/scientific_papers` dataset. This dataset contains a list of scientific articles with separate `abstracts` and `lists of contents`. Here is the synthetic data generation overview: 1. All the `abstracts` and `lists of contents` were inserted into the database. 2. The main content of every article was split into overlapping segments of 1k LLaMA tokens with a 200-token overlap. 3. 10k of the `abstracts` + `lists of contents` were summarized by LLaMA 13b. 4. Generated `summaries` + `split text segments` were transformed by LLaMA 13b into unprocessed JSONs. 5. All generated JSONs were validated and cleaned up. 6. Validated JSONs were reformatted into datasets that may be used for fine-tuning. ### Example of output data ```json { "section_description": "The article discusses the current reversal phenomenon in a classical deterministic ratchet system. The authors investigate the relationship between current and bifurcation diagrams, focusing on the dynamics of an ensemble of particles. They challenge Mateos' claim that current reversals occur only with bifurcations and present evidence for current reversals without bifurcations. Additionally, they show that bifurcations can occur without current reversals. 
The study highlights the importance of considering the characteristics of the ensemble in understanding the behavior of the system. The authors provide numerical evidence to support their claims and suggest that correlating abrupt changes in the current with bifurcations is more appropriate than focusing solely on current reversals.", "list_of_entities": [ "reversals", "mateos", "figures", "rules", "current_reversal", "ensemble", "bifurcation", "jumps", "thumb", "spikes", "current", "particles", "open_question", "behavior", "heuristics", "direction", "chaotic", "parameter" ], "relations": [ { "description": "bifurcations in single - trajectory behavior often corresponds to sudden spikes or jumps in the current for an ensemble in the same system", "source_entities": [ "bifurcation" ], "target_entities": [ "current" ] }, { "description": "current reversals are a special case of this", "source_entities": [ "current" ], "target_entities": [ "bifurcation" ] }, { "description": "not all spikes or jumps correspond to a bifurcation", "source_entities": [ "spikes" ], "target_entities": [ "bifurcation" ] }, { "description": "the open question is clearly to figure out if the reason for when these rules are violated or are valid can be made more concrete", "source_entities": [ "current" ], "target_entities": [ "open_question" ] } ] } ``` ### Expected output JSON schema ```json { "$schema": "extraction_schema.json", "type": "object", "properties": { "section_description": { "type": "string" }, "list_of_entities": { "type": "array", "items": { "type": "string" } }, "relations": { "type": "array", "items": { "type": "object", "properties": { "description": { "type": "string" }, "source_entities": { "type": "array", "items": { "type": "string" } }, "target_entities": { "type": "array", "items": { "type": "string" } }, "strength": { "type": "string", "enum": ["strong", "moderate", "weak"] } }, "required": ["description", "source_entities", "target_entities"] } } }, "required": ["list_of_entities", "relations", "section_description"] } ``` ### Example of preprocessed fine-tuning data This document details the preprocessing of fine-tuning data within the `llama2_prompts.csv` file, showcasing six distinct prompt formats designed to explore the optimal structure for training models on the task of semantic relation extraction: 1. `prompt_with_summary_and_schema`: Incorporates both a concise summary of the content and a structured schema outlining the expected JSON. 2. `prompt_with_summary`: Features a summary of the content without an explicit schema. 3. `prompt_with_merged_text`: Presents data as a unified text block, merging summary with extraction text. 4. `prompt_with_merged_text_and_schema`: Combines the merged text approach with a schema to guide the extraction process. 5. `prompt_no_summary_with_schema`: Excludes the summary but includes a schema, emphasizing the JSON structure. 6. `prompt_no_summary`: Provides the raw data without any summary or schema, offering the most unstructured form of the content. Model is expected to learn schema just from output. These variations are crafted from the same underlying data but are differentiated by their structural modifications or the omission of sections. Empirical testing is necessary to determine the degree of structure and guidance required for models to effectively learn and perform the extraction task.
This approach aims to identify the optimal data presentation format that balances informational completeness with processing efficiency, thereby enhancing the model's learning effectiveness in semantic relation extraction. ```text Below is an summary and excerpt from an article. Your task is to extract information about entities and relations to the JSON format as follows: `json-schema { "$schema": "extraction_schema.json", "type": "object", "properties": { "section_description": { "type": "string" } "list_of_entities": { "type": "array", "items": { "type": "string" } }, "relations": { "type": "array", "items": { "type": "object", "properties": { "description": { "type": "string" }, "source_entities": { "type": "array", "items": { "type": "string" } }, "target_entities": { "type": "array", "items": { "type": "string" } }, "strength": { "type": "string", "enum": ["strong", "moderate", "weak"] } }, "required": ["description", "source_entities", "target_entities"] } }, }, "required": ["list_of_entities", "relations", "section_description"] } ` ### General Text Summary: The article investigates the generalized form factors of the nucleon within the framework of the chiral quark soliton model (CQSM). The study focuses on the pion mass dependence of final predictions and compares them with lattice QCD simulations carried out in the heavy pion region. The results reveal that some observables are highly sensitive to the variation of the pion mass, indicating that the negligible importance of quark orbital angular momentum found in the unrealistic heavy pion world may not hold true in the real world near the chiral limit. The article is divided into five sections: 1. Introduction: The authors introduce the topic and provide a brief overview of the study. 2. Model Lagrangian with Pion Mass Term: The authors present the CQSM Lagrangian with a pion mass term and explain its significance in the study. 3. Generalized Form Factors in the CQSM: The authors discuss the definition and properties of generalized form factors in the context of the CQSM. 4. Numerical Results and Discussions: The authors present the numerical results of their study and provide a detailed analysis of the pion mass dependence of final predictions. 5. Concluding Remarks: The authors summarize their findings and highlight the importance of considering the pion mass dependence in studies of the nucleon. Additionally, they prove the momentum sum rule for the generalized form factors. ### Text Part to Extract From: @xmath62 . note in particular in this figure that eyeball tests can be misleading . we see reversals without bifurcations in ( a ) whereas the zoomed version ( c ) shows that there are windows of periodic and chaotic regimes . this is further evidence that jumps in the current correspond in general to bifurcation.,title="fig:",width=302 ] for @xmath7 and @xmath79 , current ( upper ) and bifurcation diagram ( lower ) versus @xmath0.,title="fig:",width=302 ] however , a * different * rule of thumb , previously not proposed , emerges from our studies . this generalizes mateos conjecture to say that * ( iv ) bifurcations correspond to sudden current changes ( spikes or jumps)*. note that this means these changes in current are not necessarily reversals of direction . if this current jump or spike goes through zero , this coincides with a current reversal , making the mateos conjecture a special case . 
the physical basis of this argument is the fact that ensembles of particles in chaotic systems _ can _ have net directed transport but the details of this behavior depends relatively sensitively on the system parameters . this parameter dependence is greatly exaggerated at the bifurcation point , when the dynamics of the underlying single - particle system undergoes a transition a period - doubling transition , for example , or one from chaos to regular behavior . scanning the relevant figures , we see that this is a very useful rule of thumb . for example , it completely captures the behaviour of fig . ( [ figure6 ] ) which can not be understood as either an example of the mateos conjecture , or even a failure thereof . as such , this rule significantly enhances our ability to characterize changes in the behavior of the current as a function of parameter . a further example of where this modified conjecture helps us is in looking at a seeming negation of the mateos conjecture , that is , an example where we seem to see current - reversal without bifurcation , visible in fig . ( [ hidden - bifur ] ) . the current - reversals in that scan of parameter space seem to happen inside the chaotic regime and seemingly independent of bifurcation . however , this turns out to be a ` hidden ' bifurcation when we zoom in on the chaotic regime , we see hidden periodic windows . this is therefore consistent with our statement that sudden current changes are associated with bifurcations . each of the transitions from periodic behavior to chaos and back provides opportunities for the current to spike . however , in not all such cases can these hidden bifurcations be found . we can see an example of this in fig . ( [ rev - nobifur ] ) . the current is seen to move smoothly across @xmath80 with seemingly no corresponding bifurcations , even when we do a careful zoom on the data , as in fig . ( [ hidden - bifur ] ) . however , arguably , although subjective , this change is close to the bifurcation point . this result , that there are situations where the heuristics simply do not seem to apply , are part of the open questions associated with this problem , of course . we note , however , that we have seen that these broad arguments hold when we vary other parameters as well ( figures not shown here ) . in conclusion , in this paper we have taken the approach that it is useful to find general rules of thumb ( even if not universally valid ) to understand the complicated behavior of non - equilibrium nonlinear statistical mechanical systems . in the case of chaotic deterministic ratchets , we have shown that it is important to factor out issues of size , location , spread , and transience in computing the ` current ' due to an ensemble before we search for such rules , and that the dependence on ensemble characteristics is most critical near certain bifurcation points . we have then argued that the following heuristic characteristics hold : bifurcations in single - trajectory behavior often corresponds to sudden spikes or jumps in the current for an ensemble in the same system . current reversals are a special case of this . however , not all spikes or jumps correspond to a bifurcation , nor vice versa . the open question is clearly to figure out if the reason for when these rules are violated or are valid can be made more concrete . a.k . gratefully acknowledges t. barsch and kamal p. 
singh for stimulating discussions , the reimar lst grant and financial support from the alexander von humboldt foundation in bonn . a.k.p . is grateful to carleton college for the ` sit , wallin , and class of 1949 ' sabbatical fellowships , and to the mpip ### Extracted Relations: { "section_description": "The article discusses the current reversal phenomenon in a classical deterministic ratchet system. The authors investigate the relationship between current and bifurcation diagrams, focusing on the dynamics of an ensemble of particles. They challenge Mateos' claim that current reversals occur only with bifurcations and present evidence for current reversals without bifurcations. Additionally, they show that bifurcations can occur without current reversals. The study highlights the importance of considering the characteristics of the ensemble in understanding the behavior of the system. The authors provide numerical evidence to support their claims and suggest that correlating abrupt changes in the current with bifurcations is more appropriate than focusing solely on current reversals.", "list_of_entities": [ "reversals", "mateos", "figures", "rules", "current_reversal", "ensemble", "bifurcation", "jumps", "thumb", "spikes", "current", "particles", "open_question", "behavior", "heuristics", "direction", "chaotic", "parameter" ], "relations": [ { "description": "bifurcations in single - trajectory behavior often corresponds to sudden spikes or jumps in the current for an ensemble in the same system", "source_entities": [ "bifurcation" ], "target_entities": [ "current" ] }, { "description": "current reversals are a special case of this", "source_entities": [ "current" ], "target_entities": [ "bifurcation" ] }, { "description": "not all spikes or jumps correspond to a bifurcation", "source_entities": [ "spikes" ], "target_entities": [ "bifurcation" ] }, { "description": "the open question is clearly to figure out if the reason for when these rules are violated or are valid can be made more concrete", "source_entities": [ "current" ], "target_entities": [ "open_question" ] } ] } ``` ## Decisions 1. There is a whole section of the database with extracted relations and entities, mostly for estimating the connectivity and scale of the extracted data. 2. I chose `datasets/scientific_papers` as it already provided a good base for summaries (i.e., Abstracts) and did not require me to iteratively summarize all the contents, which would require additional time. 3. This project does not use ChatGPT or other external APIs; all processing was done locally on 2x3090RTX + some OrangePIs. The goal is to generate a fine-tuned model that can be hosted more cheaply, and also provide the same utility as this two-step LLaMA 13b process. OpenAI does not allow using the results of generation for fine-tuning other models; hence, all this data was generated locally with LLaMA 2, as the license permits improving LLaMA 2 with data generated with LLaMA 2. This is not perfect, but as long as I use `datasets/scientific_papers`, there is still the issue of licensing; it all will need to be regenerated in the future with a more open stack. 4. The goal is to create a small 3B-7B model that can be used for the task of extracting entities and semantic relations, which may be run on a small ARM board like OrangePI, with minimal cost at a reasonable speed. 5. I used LLaMA 2 Chat because, in the past, I was able to achieve the most stable results with that model. 6. 
I set the temperature to 0.7 to allow the model to infer some missing information and generate better summaries, but the trade-off of using a non-zero temperature is more involved result cleanup. Still, almost 88% of the generated data had a fixable structure. ## Future Plans for the Project 1. Fine-tune LLaMA 2 7B with synthetic data (try and evaluate the speed and quality of generation). 2. Generate more synthetic data, clean it, and fine-tune the model further. 3. Build a system for mixed querying of the data (I've built a prototype; now, I would like to recreate it as a whole standalone service). 4. After running it successfully, regenerate data based on the Wikipedia dataset or another fully open-source dataset, and replace LLaMA with a truly open-source model. ## Statistics 1. I ran the generation on 4 instances of LLaMA 2-chat on 2x3090RTX + i7 4790K. The processing averaged around 1 result per minute (either a summary or JSON). The whole process, excluding coding and experimentation, took approximately 20,000 minutes, which is roughly 14 days of compute time, and required about 120 kWh of power. In the near future, I need to upgrade the CPU + RAM to remove that bottleneck. ```bash ./run_llm_servers_for_data_generation.sh -n 4 -t 1 -m "models/llama-2-13b-chat.Q4_K_M.gguf" -c 4096 -b 1512 ``` 2. I tested hosting on ARM boards; a 13b model quantized to q4 was able to be hosted with stable speed for an extended time, achieving a speed of 2.34 tokens/s per one OrangePI. With an RTX 3090 paired with my somewhat outdated CPU, an i7 4790K, I was able to achieve up to 20 tokens/s. I have 5 OrangePIs 5 16GB, and by running models on all of them, I achieved around 11.7 tokens/s for approximately 50W of power. ### Use Cases The "Semantic Relations Extraction" dataset is ideal for researchers and developers aiming to create automated systems for extracting systematized knowledge from text. It is particularly useful for projects focused on building knowledge graphs, enhancing question answering systems, and developing tools for automated alerting based on semantic analysis. ### Licensing Information This dataset is derived from the `scientific_papers` dataset, which unfortunately does not have a clear license. Future plans involve regenerating the entire dataset using the Wikipedia dataset and fully open-source models to ensure broader accessibility and compliance with open-source licensing standards.
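As a concrete illustration of step 5 in the Generation Process section above (validating the generated JSONs), here is a minimal sketch that checks a model output against a condensed copy of the expected schema. The `jsonschema` package is an assumption; the pipeline's actual validator lives in the linked GitHub repository.

```python
import json

from jsonschema import ValidationError, validate  # pip install jsonschema

# Condensed version of the "Expected output JSON schema" shown above.
SCHEMA = {
    "type": "object",
    "properties": {
        "section_description": {"type": "string"},
        "list_of_entities": {"type": "array", "items": {"type": "string"}},
        "relations": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "description": {"type": "string"},
                    "source_entities": {"type": "array", "items": {"type": "string"}},
                    "target_entities": {"type": "array", "items": {"type": "string"}},
                    "strength": {"type": "string", "enum": ["strong", "moderate", "weak"]},
                },
                "required": ["description", "source_entities", "target_entities"],
            },
        },
    },
    "required": ["list_of_entities", "relations", "section_description"],
}


def is_valid_extraction(raw: str) -> bool:
    """Return True if a model output parses as JSON and matches the schema."""
    try:
        validate(json.loads(raw), SCHEMA)
        return True
    except (json.JSONDecodeError, ValidationError):
        return False
```

The cleaned CSV configs can then be pulled directly, e.g. `load_dataset("DehydratedWater42/semantic_relations_extraction", "llama2_prompts")`, using the config names listed in the repository metadata.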
DehydratedWater42/semantic_relations_extraction
[ "task_categories:summarization", "task_categories:feature-extraction", "task_categories:text-generation", "task_categories:text2text-generation", "size_categories:1K<n<10K", "language:en", "math", "semantic", "extraction", "graph", "relations", "science", "synthetic", "region:us" ]
2024-02-12T15:34:46+00:00
{"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["summarization", "feature-extraction", "text-generation", "text2text-generation"], "pretty_name": "SemanticRelationsExtraction", "licence": ["license:unknown"], "tags": ["math", "semantic", "extraction", "graph", "relations", "science", "synthetic"], "configs": [{"config_name": "core_extracted_relations", "data_files": [{"split": "train", "path": ["core_extracted_relations.csv"]}], "default": true}, {"config_name": "extracted_relations", "data_files": [{"split": "train", "path": ["extracted_relations.csv"]}]}, {"config_name": "llama2_prompts", "data_files": [{"split": "train", "path": ["llama2_prompts.csv"]}]}]}
2024-02-14T15:02:02+00:00
[]
[ "en" ]
TAGS #task_categories-summarization #task_categories-feature-extraction #task_categories-text-generation #task_categories-text2text-generation #size_categories-1K<n<10K #language-English #math #semantic #extraction #graph #relations #science #synthetic #region-us
# Dataset Card for "Semantic Relations Extraction" ## Dataset Description ### Repository The "Semantic Relations Extraction" dataset is hosted on the Hugging Face platform, and was created with code from this GitHub repository. ### Purpose The "Semantic Relations Extraction" dataset was created for the purpose of fine-tuning smaller LLama2 (7B) models to speed up and reduce the costs of extracting semantic relations between entities in texts. This repository is part of a larger project aimed at creating a low-cost system for preprocessing documents in order to build a knowledge graph used for question answering and automated alerting. ### Data Sources The dataset was built using the following source: - 'datasets/scientific_papers' ### Files in the Dataset The repository contains 4 files: 1. 'extracted_relations.csv' -> A dataset of generated relations between entities containing columns for ['summary', 'article part', 'output json', 'database', 'abstract', 'list_of_contents']. 2. 'core_extracted_relations.csv' -> The same dataset but without the original abstracts and lists_of_contents. It contains columns for ['summary', 'article part', 'output json']. 3. 'llama2_prompts.csv' -> Multiple variants of the prompt with a response that can be used for fine-tuning the model. It is created by concatenating data in 'core_extracted_relations.csv'. 4. 'synthetic_data_12_02_24-URL' -> A backup of the whole PostgreSQL database used during data generation. It is the source for all the other files, exported by the 'airflow' user in custom format, with compression level 6 and UTF-8 encoding. ### Database Schema The dataset includes a database schema illustration, which provides an overview of how the data is organized within the database. !Database Schema ### GitHub Repository Synthetic data was generated using an Airflow data pipeline. The entire codebase can be accessed in this GitHub repository. ### Generation Process This data was generated based on the 'datasets/scientific_papers' dataset. This dataset contains a list of scientific articles with separate 'abstracts' and 'lists of contents'. Here is the synthetic data generation overview: 1. All the 'abstracts' and 'lists of contents' were inserted into the database. 2. The main content of every article was split into overlapping segments of 1k LLaMA tokens with a 200-token overlap. 3. 10k of the 'abstracts' + 'lists of contents' were summarized by LLaMA 13b. 4. Generated 'summaries' + 'split text segments' were transformed by LLaMA 13b into unprocessed JSONs. 5. All generated JSONs were validated and cleaned up. 6. Validated JSONs were reformatted into datasets that may be used for fine-tuning. ### Example of output data ### Expected output JSON schema ### Example of preprocessed fine-tuning data This document details the preprocessing of fine-tuning data within the 'URL' file, showcasing six distinct prompt formats designed to explore the optimal structure for training models on the task of semantic relation extraction: 1. 'prompt_with_summary_and_schema': Incorporates both a concise summary of the content and a structured schema outlining the expected JSON. 2. 'prompt_with_summary': Features a summary of the content without an explicit schema. 3. 'prompt_with_merged_text': Presents data as a unified text block, merging summary with extraction text. 4. 'prompt_with_merged_text_and_schema': Combines the merged text approach with a schema to guide the extraction process. 5. 
'prompt_no_summary_with_schema': Excludes the summary but includes a schema, emphasizing the JSON structure. 6. 'prompt_no_summary': Provides the raw data without any summary or schema, offering the most unstructured form of the content. Model is expected to learn schema just from output. These variations are crafted from the same underlying data but are differentiated by their structural modifications or the omission of sections. Empirical testing is necessary to determine the degree of structure and guidance required for models to effectively learn and perform the extraction task. This approach aims to identify the optimal data presentation format that balances informational completeness with processing efficiency, thereby enhancing the model's learning effectiveness in semantic relation extraction. ## Decisions 1. There is a whole section of the database with extracted relations and entities, mostly for estimating the connectivity and scale of the extracted data. 2. I chose 'datasets/scientific_papers' as it already provided a good base for summaries (i.e., Abstracts) and did not require me to iteratively summarize all the contents, which would require additional time. 3. This project does not use ChatGPT or other external APIs; all processing was done locally on 2x3090RTX + some OrangePIs. The goal is to generate a fine-tuned model that can be hosted more cheaply, and also provide the same utility as this two-step LLaMA 13b process. OpenAI does not allow using the results of generation for fine-tuning other models; hence, all this data was generated locally with LLaMA 2, as the license permits improving LLaMA 2 with data generated with LLaMA 2. This is not perfect, but as long as I use 'datasets/scientific_papers', there is still the issue of licensing; it all will need to be regenerated in the future with a more open stack. 4. The goal is to create a small 3B-7B model that can be used for the task of extracting entities and semantic relations, which may be run on a small ARM board like OrangePI, with minimal cost at a reasonable speed. 5. I used LLaMA 2 Chat because, in the past, I was able to achieve the most stable results with that model. 6. I set the temperature to 0.7 to allow the model to infer some missing information and generate better summaries, but the trade-off of using a non-zero temperature is more involved result cleanup. Still, almost 88% of the generated data had a fixable structure. ## Future Plans for the Project 1. Fine-tune LLaMA 2 7B with synthetic data (try and evaluate the speed and quality of generation). 2. Generate more synthetic data, clean it, and fine-tune the model further. 3. Build a system for mixed querying of the data (I've built a prototype; now, I would like to recreate it as a whole standalone service). 4. After running it successfully, regenerate data based on the Wikipedia dataset or another fully open-source dataset, and replace LLaMA with a truly open-source model. ## Statistics 1. I ran the generation on 4 instances of LLaMA 2-chat on 2x3090RTX + i7 4790K. The processing averaged around 1 result per minute (either a summary or JSON). The whole process, excluding coding and experimentation, took approximately 20,000 minutes, which is roughly 14 days of compute time, and required about 120 kWh of power. In the near future, I need to upgrade the CPU + RAM to remove that bottleneck. 2. 
I tested hosting on ARM boards; a 13b model quantized to q4 was able to be hosted with stable speed for an extended time, achieving a speed of 2.34 tokens/s per one OrangePI. With an RTX 3090 paired with my somewhat outdated CPU, an i7 4790K, I was able to achieve up to 20 tokens/s. I have 5 OrangePIs 5 16GB, and by running models on all of them, I achieved around 11.7 tokens/s for approximately 50W of power. ### Use Cases The "Semantic Relations Extraction" dataset is ideal for researchers and developers aiming to create automated systems for extracting systematized knowledge from text. It is particularly useful for projects focused on building knowledge graphs, enhancing question answering systems, and developing tools for automated alerting based on semantic analysis. ### Licensing Information This dataset is derived from the 'scientific_papers' dataset, which unfortunately does not have a clear license. Future plans involve regenerating the entire dataset using the Wikipedia dataset and fully open-source models to ensure broader accessibility and compliance with open-source licensing standards.
[ "# Dataset Card for \"Semantic Relations Extraction\"", "## Dataset Description", "### Repository\nThe \"Semantic Relations Extraction\" dataset is hosted on the Hugging Face platform, and was created with code from this GitHub repository.", "### Purpose\nThe \"Semantic Relations Extraction\" dataset was created for the purpose of fine-tuning smaller LLama2 (7B) models to speed up and reduce the costs of extracting semantic relations between entities in texts. This repository is part of a larger project aimed at creating a low-cost system for preprocessing documents in order to build a knowledge graph used for question answering and automated alerting.", "### Data Sources\nThe dataset was built using the following source:\n- 'datasets/scientific_papers'", "### Files in the Dataset\nThe repository contains 4 files:\n1. 'extracted_relations.csv' -> A dataset of generated relations between entities containing columns for ['summary', 'article part', 'output json', 'database', 'abstract', 'list_of_contents'].\n2. 'core_extracted_relations.csv' -> The same dataset but without the original abstracts and lists_of_contents. It contains columns for ['summary', 'article part', 'output json'].\n3. 'llama2_prompts.csv' -> Multiple variants of the prompt with a response that can be used for fine-tuning the model. It is created by concatenating data in 'core_extracted_relations.csv'.\n4. 'synthetic_data_12_02_24-URL' -> A backup of the whole PostgreSQL database used during data generation. It is the source for all the other files, exported by the 'airflow' user in custom format, with compression level 6 and UTF-8 encoding.", "### Database Schema\nThe dataset includes a database schema illustration, which provides an overview of how the data is organized within the database.\n\n!Database Schema", "### GitHub Repository\nSynthetic data was generated using an Airflow data pipeline. The entire codebase can be accessed in this GitHub repository.", "### Generation Process\nThis data was generated based on the 'datasets/scientific_papers' dataset. This dataset contains a list of scientific articles with separate 'abstracts' and 'lists of contents'. Here is the synthetic data generation overview:\n\n1. All the 'abstracts' and 'lists of contents' were inserted into the database.\n2. The main content of every article was split into overlapping segments of 1k LLaMA tokens with a 200-token overlap.\n3. 10k of the 'abstracts' + 'lists of contents' were summarized by LLaMA 13b.\n4. Generated 'summaries' + 'split text segments' were transformed by LLaMA 13b into unprocessed JSONs.\n5. All generated JSONs were validated and cleaned up.\n6. Validated JSONs were reformatted into datasets that may be used for fine-tuning.", "### Example of output data", "### Expected output JSON schema", "### Example of preprocessed fine-tuning data\nThis document details the preprocessing of fine-tuning data within the 'URL' file, showcasing six distinct prompt formats designed to explore the optimal structure for training models on the task of semantic relation extraction:\n1. 'prompt_with_summary_and_schema': Incorporates both a concise summary of the content and a structured schema outlining the expected JSON.\n2. 'prompt_with_summary': Features a summary of the content without an explicit schema.\n3. 'prompt_with_merged_text': Presents data as a unified text block, merging summary with extraction text.\n4. 'prompt_with_merged_text_and_schema': Combines the merged text approach with a schema to guide the extraction process.\n5. 
'prompt_no_summary_with_schema': Excludes the summary but includes a schema, emphasizing the JSON structure.\n6. 'prompt_no_summary': Provides the raw data without any summary or schema, offering the most unstructured form of the content. Model is expected to learn schema just from output.\n\nThese variations are crafted from the same underlying data but are differentiated by their structural modifications or the omission of sections. Empirical testing is necessary to determine the degree of structure and guidance required for models to effectively learn and perform the extraction task. This approach aims to identify the optimal data presentation format that balances informational completeness with processing efficiency, thereby enhancing the model's learning effectiveness in semantic relation extraction.", "## Decisions\n1. There is a whole section of the database with extracted relations and entities, mostly for estimating the connectivity and scale of the extracted data.\n2. I chose 'datasets/scientific_papers' as it already provided a good base for summaries (i.e., Abstracts) and did not require me to iteratively summarize all the contents, which would require additional time.\n3. This project does not use ChatGPT or other external APIs; all processing was done locally on 2x3090RTX + some OrangePIs. The goal is to generate a fine-tuned model that can be hosted more cheaply, and also provide the same utility as this two-step LLaMA 13b process. OpenAI does not allow using the results of generation for fine-tuning other models; hence, all this data was generated locally with LLaMA 2, as the license permits improving LLaMA 2 with data generated with LLaMA 2. This is not perfect, but as long as I use 'datasets/scientific_papers', there is still the issue of licensing; it all will need to be regenerated in the future with a more open stack.\n4. The goal is to create a small 3B-7B model that can be used for the task of extracting entities and semantic relations, which may be run on a small ARM board like OrangePI, with minimal cost at a reasonable speed.\n5. I used LLaMA 2 Chat because, in the past, I was able to achieve the most stable results with that model.\n6. I set the temperature to 0.7 to allow the model to infer some missing information and generate better summaries, but the trade-off of using a non-zero temperature is more involved result cleanup. Still, almost 88% of the generated data had a fixable structure.", "## Future Plans for the Project\n1. Fine-tune LLaMA 2 7B with synthetic data (try and evaluate the speed and quality of generation).\n2. Generate more synthetic data, clean it, and fine-tune the model further.\n3. Build a system for mixed querying of the data (I've built a prototype; now, I would like to recreate it as a whole standalone service).\n4. After running it successfully, regenerate data based on the Wikipedia dataset or another fully open-source dataset, and replace LLaMA with a truly open-source model.", "## Statistics\n1. I ran the generation on 4 instances of LLaMA 2-chat on 2x3090RTX + i7 4790K. The processing averaged around 1 result per minute (either a summary or JSON). The whole process, excluding coding and experimentation, took approximately 20,000 minutes, which is roughly 14 days of compute time, and required about 120 kWh of power. In the near future, I need to upgrade the CPU + RAM to remove that bottleneck.\n\n2. 
I tested hosting on ARM boards; a 13b model quantized to q4 was able to be hosted with stable speed for an extended time, achieving a speed of 2.34 tokens/s per one OrangePI. With an RTX 3090 paired with my somewhat outdated CPU, an i7 4790K, I was able to achieve up to 20 tokens/s. I have 5 OrangePIs 5 16GB, and by running models on all of them, I achieved around 11.7 tokens/s for approximately 50W of power.", "### Use Cases\nThe \"Semantic Relations Extraction\" dataset is ideal for researchers and developers aiming to create automated systems for extracting systematized knowledge from text. It is particularly useful for projects focused on building knowledge graphs, enhancing question answering systems, and developing tools for automated alerting based on semantic analysis.", "### Licensing Information\nThis dataset is derived from the 'scientific_papers' dataset, which unfortunately does not have a clear license. Future plans involve regenerating the entire dataset using the Wikipedia dataset and fully open-source models to ensure broader accessibility and compliance with open-source licensing standards." ]
[ "TAGS\n#task_categories-summarization #task_categories-feature-extraction #task_categories-text-generation #task_categories-text2text-generation #size_categories-1K<n<10K #language-English #math #semantic #extraction #graph #relations #science #synthetic #region-us \n", "# Dataset Card for \"Semantic Relations Extraction\"", "## Dataset Description", "### Repository\nThe \"Semantic Relations Extraction\" dataset is hosted on the Hugging Face platform, and was created with code from this GitHub repository.", "### Purpose\nThe \"Semantic Relations Extraction\" dataset was created for the purpose of fine-tuning smaller LLama2 (7B) models to speed up and reduce the costs of extracting semantic relations between entities in texts. This repository is part of a larger project aimed at creating a low-cost system for preprocessing documents in order to build a knowledge graph used for question answering and automated alerting.", "### Data Sources\nThe dataset was built using the following source:\n- 'datasets/scientific_papers'", "### Files in the Dataset\nThe repository contains 4 files:\n1. 'extracted_relations.csv' -> A dataset of generated relations between entities containing columns for ['summary', 'article part', 'output json', 'database', 'abstract', 'list_of_contents'].\n2. 'core_extracted_relations.csv' -> The same dataset but without the original abstracts and lists_of_contents. It contains columns for ['summary', 'article part', 'output json'].\n3. 'llama2_prompts.csv' -> Multiple variants of the prompt with a response that can be used for fine-tuning the model. It is created by concatenating data in 'core_extracted_relations.csv'.\n4. 'synthetic_data_12_02_24-URL' -> A backup of the whole PostgreSQL database used during data generation. It is the source for all the other files, exported by the 'airflow' user in custom format, with compression level 6 and UTF-8 encoding.", "### Database Schema\nThe dataset includes a database schema illustration, which provides an overview of how the data is organized within the database.\n\n!Database Schema", "### GitHub Repository\nSynthetic data was generated using an Airflow data pipeline. The entire codebase can be accessed in this GitHub repository.", "### Generation Process\nThis data was generated based on the 'datasets/scientific_papers' dataset. This dataset contains a list of scientific articles with separate 'abstracts' and 'lists of contents'. Here is the synthetic data generation overview:\n\n1. All the 'abstracts' and 'lists of contents' were inserted into the database.\n2. The main content of every article was split into overlapping segments of 1k LLaMA tokens with a 200-token overlap.\n3. 10k of the 'abstracts' + 'lists of contents' were summarized by LLaMA 13b.\n4. Generated 'summaries' + 'split text segments' were transformed by LLaMA 13b into unprocessed JSONs.\n5. All generated JSONs were validated and cleaned up.\n6. Validated JSONs were reformatted into datasets that may be used for fine-tuning.", "### Example of output data", "### Expected output JSON schema", "### Example of preprocessed fine-tuning data\nThis document details the preprocessing of fine-tuning data within the 'URL' file, showcasing six distinct prompt formats designed to explore the optimal structure for training models on the task of semantic relation extraction:\n1. 'prompt_with_summary_and_schema': Incorporates both a concise summary of the content and a structured schema outlining the expected JSON.\n2. 
'prompt_with_summary': Features a summary of the content without an explicit schema.\n3. 'prompt_with_merged_text': Presents data as a unified text block, merging summary with extraction text.\n4. 'prompt_with_merged_text_and_schema': Combines the merged text approach with a schema to guide the extraction process.\n5. 'prompt_no_summary_with_schema': Excludes the summary but includes a schema, emphasizing the JSON structure.\n6. 'prompt_no_summary': Provides the raw data without any summary or schema, offering the most unstructured form of the content. Model is expected to learn schema just from output.\n\nThese variations are crafted from the same underlying data but are differentiated by their structural modifications or the omission of sections. Empirical testing is necessary to determine the degree of structure and guidance required for models to effectively learn and perform the extraction task. This approach aims to identify the optimal data presentation format that balances informational completeness with processing efficiency, thereby enhancing the model's learning effectiveness in semantic relation extraction.", "## Decisions\n1. There is a whole section of the database with extracted relations and entities, mostly for estimating the connectivity and scale of the extracted data.\n2. I chose 'datasets/scientific_papers' as it already provided a good base for summaries (i.e., Abstracts) and did not require me to iteratively summarize all the contents, which would require additional time.\n3. This project does not use ChatGPT or other external APIs; all processing was done locally on 2x3090RTX + some OrangePIs. The goal is to generate a fine-tuned model that can be hosted more cheaply, and also provide the same utility as this two-step LLaMA 13b process. OpenAI does not allow using the results of generation for fine-tuning other models; hence, all this data was generated locally with LLaMA 2, as the license permits improving LLaMA 2 with data generated with LLaMA 2. This is not perfect, but as long as I use 'datasets/scientific_papers', there is still the issue of licensing; it all will need to be regenerated in the future with a more open stack.\n4. The goal is to create a small 3B-7B model that can be used for the task of extracting entities and semantic relations, which may be run on a small ARM board like OrangePI, with minimal cost at a reasonable speed.\n5. I used LLaMA 2 Chat because, in the past, I was able to achieve the most stable results with that model.\n6. I set the temperature to 0.7 to allow the model to infer some missing information and generate better summaries, but the trade-off of using a non-zero temperature is more involved result cleanup. Still, almost 88% of the generated data had a fixable structure.", "## Future Plans for the Project\n1. Fine-tune LLaMA 2 7B with synthetic data (try and evaluate the speed and quality of generation).\n2. Generate more synthetic data, clean it, and fine-tune the model further.\n3. Build a system for mixed querying of the data (I've built a prototype; now, I would like to recreate it as a whole standalone service).\n4. After running it successfully, regenerate data based on the Wikipedia dataset or another fully open-source dataset, and replace LLaMA with a truly open-source model.", "## Statistics\n1. I ran the generation on 4 instances of LLaMA 2-chat on 2x3090RTX + i7 4790K. The processing averaged around 1 result per minute (either a summary or JSON). 
The whole process, excluding coding and experimentation, took approximately 20,000 minutes, which is roughly 14 days of compute time, and required about 120 kWh of power. In the near future, I need to upgrade the CPU + RAM to remove that bottleneck.\n\n2. I tested hosting on ARM boards; a 13b model quantized to q4 could be hosted at a stable speed for an extended time, achieving 2.34 tokens/s per OrangePi. With an RTX 3090 paired with my somewhat outdated CPU, an i7 4790K, I was able to achieve up to 20 tokens/s. I have five OrangePi 5 boards (16 GB each), and by running models on all of them, I achieved around 11.7 tokens/s at approximately 50 W of power.", "### Use Cases\nThe \"Semantic Relations Extraction\" dataset is ideal for researchers and developers aiming to create automated systems for extracting systematized knowledge from text. It is particularly useful for projects focused on building knowledge graphs, enhancing question answering systems, and developing tools for automated alerting based on semantic analysis.", "### Licensing Information\nThis dataset is derived from the 'scientific_papers' dataset, which unfortunately does not have a clear license. Future plans involve regenerating the entire dataset using the Wikipedia dataset and fully open-source models to ensure broader accessibility and compliance with open-source licensing standards." ]
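The segmentation step in the card's "Generation Process" (1k-token windows with a 200-token overlap) can be sketched as follows. This is a minimal illustration, not the project's pipeline code: the tokenizer checkpoint is an assumption, and `split_with_overlap` is a hypothetical helper that only demonstrates the windowing arithmetic.

```python
# Sketch of the overlapping segmentation described under "Generation Process".
# Assumes a LLaMA-family tokenizer; the checkpoint name is illustrative only.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-13b-chat-hf")

def split_with_overlap(text: str, window: int = 1000, overlap: int = 200) -> list[str]:
    """Split `text` into ~`window`-token segments, each sharing `overlap` tokens with the previous one."""
    ids = tokenizer(text, add_special_tokens=False)["input_ids"]
    stride = window - overlap  # 800 new tokens per step
    segments = []
    for start in range(0, max(len(ids) - overlap, 1), stride):
        segments.append(tokenizer.decode(ids[start:start + window]))
    return segments
```

Each segment would then go to the summarize-and-extract steps (3-4) together with the article's generated summary.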
41007204901034d5eee678e12923ac2ef762a72a
# Dataset Card for "german-law-dataset-with-embeddings" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ftopal/german-law-dataset-with-embeddings
[ "region:us" ]
2024-02-12T15:36:47+00:00
{"dataset_info": {"features": [{"name": "page_url", "dtype": "string"}, {"name": "law_page_url", "dtype": "string"}, {"name": "text", "dtype": "string"}, {"name": "sections", "sequence": "string"}, {"name": "embeddings", "sequence": "float32"}, {"name": "section_embeddings", "sequence": {"sequence": "float32"}}], "splits": [{"name": "train", "num_bytes": 661003314, "num_examples": 6746}], "download_size": 503817943, "dataset_size": 661003314}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T17:24:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for "german-law-dataset-with-embeddings" More Information needed
[ "# Dataset Card for \"german-law-dataset-with-embeddings\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"german-law-dataset-with-embeddings\"\n\nMore Information needed" ]
63f7224a368616ad7cab99fadb5a59070a66c0b1
# Dataset Card for Evaluation run of paulml/DPOB-INMTOB-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [paulml/DPOB-INMTOB-7B](https://huggingface.co/paulml/DPOB-INMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_paulml__DPOB-INMTOB-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T16:00:35.894215](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__DPOB-INMTOB-7B/blob/main/results_2024-02-12T16-00-35.894215.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6517696346376387, "acc_stderr": 0.03199085355554918, "acc_norm": 0.6509872024369695, "acc_norm_stderr": 0.03266122254263512, "mc1": 0.6230110159118727, "mc1_stderr": 0.01696551757893035, "mc2": 0.7660075946329132, "mc2_stderr": 0.01398886238985396 }, "harness|arc:challenge|25": { "acc": 0.7175767918088737, "acc_stderr": 0.013155456884097224, "acc_norm": 0.7320819112627986, "acc_norm_stderr": 0.012942030195136437 }, "harness|hellaswag|10": { "acc": 0.7150965943039235, "acc_stderr": 0.004504459553909768, "acc_norm": 0.8899621589324835, "acc_norm_stderr": 0.0031229736320394696 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6943396226415094, "acc_stderr": 0.028353298073322663, "acc_norm": 0.6943396226415094, "acc_norm_stderr": 0.028353298073322663 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.55, "acc_stderr": 0.05, "acc_norm": 0.55, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684, 
"acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4019607843137255, "acc_stderr": 0.04878608714466996, "acc_norm": 0.4019607843137255, "acc_norm_stderr": 0.04878608714466996 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41534391534391535, "acc_stderr": 0.025379524910778398, "acc_norm": 0.41534391534391535, "acc_norm_stderr": 0.025379524910778398 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726854, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726854 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8080808080808081, "acc_stderr": 0.028057791672989017, "acc_norm": 0.8080808080808081, "acc_norm_stderr": 0.028057791672989017 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6666666666666666, "acc_stderr": 0.023901157979402534, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.023901157979402534 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.680672268907563, "acc_stderr": 0.030283995525884396, "acc_norm": 0.680672268907563, "acc_norm_stderr": 0.030283995525884396 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, "acc_norm_stderr": 0.03929111781242742 }, 
"harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8422018348623853, "acc_stderr": 0.01563002297009244, "acc_norm": 0.8422018348623853, "acc_norm_stderr": 0.01563002297009244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455335, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455335 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8091603053435115, "acc_stderr": 0.03446513350752598, "acc_norm": 0.8091603053435115, "acc_norm_stderr": 0.03446513350752598 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252626, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252626 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.41964285714285715, "acc_stderr": 0.04684099321077106, "acc_norm": 0.41964285714285715, "acc_norm_stderr": 0.04684099321077106 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7341040462427746, "acc_stderr": 0.02378620325550829, "acc_norm": 0.7341040462427746, "acc_norm_stderr": 0.02378620325550829 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4446927374301676, "acc_stderr": 0.01661988198817702, "acc_norm": 0.4446927374301676, "acc_norm_stderr": 0.01661988198817702 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7287581699346405, "acc_stderr": 0.025457756696667878, "acc_norm": 0.7287581699346405, "acc_norm_stderr": 0.025457756696667878 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.02575586592263295, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.02575586592263295 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7438271604938271, "acc_stderr": 0.0242885336377261, "acc_norm": 0.7438271604938271, "acc_norm_stderr": 0.0242885336377261 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, 
"acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47131681877444587, "acc_stderr": 0.012749206007657473, "acc_norm": 0.47131681877444587, "acc_norm_stderr": 0.012749206007657473 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.02824568739146292, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.02824568739146292 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6699346405228758, "acc_stderr": 0.019023726160724553, "acc_norm": 0.6699346405228758, "acc_norm_stderr": 0.019023726160724553 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8507462686567164, "acc_stderr": 0.02519692987482706, "acc_norm": 0.8507462686567164, "acc_norm_stderr": 0.02519692987482706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5602409638554217, "acc_stderr": 0.03864139923699122, "acc_norm": 0.5602409638554217, "acc_norm_stderr": 0.03864139923699122 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.027966785859160893, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.027966785859160893 }, "harness|truthfulqa:mc|0": { "mc1": 0.6230110159118727, "mc1_stderr": 0.01696551757893035, "mc2": 0.7660075946329132, "mc2_stderr": 0.01398886238985396 }, "harness|winogrande|5": { "acc": 0.8468823993685872, "acc_stderr": 0.010120623252272955 }, "harness|gsm8k|5": { "acc": 0.6921910538286581, "acc_stderr": 0.012714401009923649 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
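As a supplement to the loading example above, the aggregated scores quoted under "Latest results" can also be read directly from the "results" configuration. A small sketch, relying only on the config and split names listed in this card (the exact column layout of the results parquet is not documented here):

```python
from datasets import load_dataset

# "results" aggregates all task scores for the run; the "latest" split
# always points at the most recent evaluation (see above).
results = load_dataset(
    "open-llm-leaderboard/details_paulml__DPOB-INMTOB-7B",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the available fields
print(results[0])            # the row holding the aggregated metrics
```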
open-llm-leaderboard/details_paulml__DPOB-INMTOB-7B
[ "region:us" ]
2024-02-12T16:02:57+00:00
{"pretty_name": "Evaluation run of paulml/DPOB-INMTOB-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [paulml/DPOB-INMTOB-7B](https://huggingface.co/paulml/DPOB-INMTOB-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__DPOB-INMTOB-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T16:00:35.894215](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__DPOB-INMTOB-7B/blob/main/results_2024-02-12T16-00-35.894215.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6517696346376387,\n \"acc_stderr\": 0.03199085355554918,\n \"acc_norm\": 0.6509872024369695,\n \"acc_norm_stderr\": 0.03266122254263512,\n \"mc1\": 0.6230110159118727,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.7660075946329132,\n \"mc2_stderr\": 0.01398886238985396\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7175767918088737,\n \"acc_stderr\": 0.013155456884097224,\n \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136437\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7150965943039235,\n \"acc_stderr\": 0.004504459553909768,\n \"acc_norm\": 0.8899621589324835,\n \"acc_norm_stderr\": 0.0031229736320394696\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 
0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4446927374301676,\n \"acc_stderr\": 0.01661988198817702,\n \"acc_norm\": 0.4446927374301676,\n \"acc_norm_stderr\": 0.01661988198817702\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47131681877444587,\n \"acc_stderr\": 0.012749206007657473,\n \"acc_norm\": 0.47131681877444587,\n \"acc_norm_stderr\": 0.012749206007657473\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6230110159118727,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.7660075946329132,\n \"mc2_stderr\": 0.01398886238985396\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272955\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \"acc_stderr\": 0.012714401009923649\n }\n}\n```", "repo_url": "https://huggingface.co/paulml/DPOB-INMTOB-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|arc:challenge|25_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|gsm8k|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hellaswag|10_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-00-35.894215.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-00-35.894215.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-00-35.894215.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T16-00-35.894215.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-00-35.894215.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T16-00-35.894215.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["**/details_harness|winogrande|5_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T16-00-35.894215.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T16_00_35.894215", "path": ["results_2024-02-12T16-00-35.894215.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T16-00-35.894215.parquet"]}]}]}
2024-02-12T16:03:25+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of paulml/DPOB-INMTOB-7B Dataset automatically created during the evaluation run of model paulml/DPOB-INMTOB-7B on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T16:00:35.894215 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of paulml/DPOB-INMTOB-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/DPOB-INMTOB-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T16:00:35.894215 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of paulml/DPOB-INMTOB-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/DPOB-INMTOB-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T16:00:35.894215 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
834ee78e41f2d68c0161fa2a93f308fc4cbf8983
# distilabel-truthy-dpo-v0.1 A DPO dataset built with [distilabel](https://github.com/argilla-io/distilabel) on top of Jon Durbin's [jondurbin/truthy-dpo-v0.1](https://huggingface.co/datasets/jondurbin/truthy-dpo-v0.1) dataset. Interestingly, it swaps a lot of chosen and rejected answers. <p align="center"> <a href="https://github.com/argilla-io/distilabel"> <img src="https://raw.githubusercontent.com/argilla-io/distilabel/main/docs/assets/distilabel-badge-light.png" alt="Built with Distilabel" width="200" height="32"/> </a> </p>
mlabonne/distilabel-truthy-dpo-v0.1
[ "region:us" ]
2024-02-12T16:07:14+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "system", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "generations", "sequence": "string"}, {"name": "order", "sequence": "string"}, {"name": "labelling_model", "dtype": "string"}, {"name": "labelling_prompt", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "raw_labelling_response", "dtype": "string"}, {"name": "rating", "sequence": "float64"}, {"name": "rationale", "dtype": "string"}, {"name": "status", "dtype": "string"}, {"name": "original_chosen", "dtype": "string"}, {"name": "original_rejected", "dtype": "string"}, {"name": "chosen_score", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 7956438, "num_examples": 1016}], "download_size": 3717532, "dataset_size": 7956438}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T19:59:01+00:00
[]
[]
TAGS #region-us
# distilabel-truthy-dpo-v0.1 A DPO dataset built with distilabel on top of Jon Durbin's jondurbin/truthy-dpo-v0.1 dataset. Interestingly, it swaps a lot of chosen and rejected answers. <p align="center"> <a href="URL"> <img src="URL" alt="Built with Distilabel" width="200" height="32"/> </a> </p>
[ "# distilabel-truthy-dpo-v0.1\n\nA DPO dataset built with distilabel on top of Jon Durbin's jondurbin/truthy-dpo-v0.1 dataset.\n\nInterestingly, it swaps a lot of chosen and rejected answers.\n\n<p align=\"center\">\n <a href=\"URL\">\n <img src=\"URL\" alt=\"Built with Distilabel\" width=\"200\" height=\"32\"/>\n </a>\n</p>" ]
[ "TAGS\n#region-us \n", "# distilabel-truthy-dpo-v0.1\n\nA DPO dataset built with distilabel on top of Jon Durbin's jondurbin/truthy-dpo-v0.1 dataset.\n\nInterestingly, it swaps a lot of chosen and rejected answers.\n\n<p align=\"center\">\n <a href=\"URL\">\n <img src=\"URL\" alt=\"Built with Distilabel\" width=\"200\" height=\"32\"/>\n </a>\n</p>" ]
dd45d74f6d3d4344a3a6d8f67f9be2350d568947
# Boletín Oficial de la República Argentina This dataset is updated daily via [argentina.gob.ar](https://www.argentina.gob.ar/normativa), using the [SandboxAI library](https://github.com/sandbox-ai/Boletin-Oficial-Argentina) # Format The dataset format is as follows: ```json { "title":"Summarized title of the entry", "name":"Assigned name", "entity":"Government entity that issues it", "content":"Content of the entry", "date":"Publication date" } ```
marianbasti/boletin-oficial-argentina
[ "size_categories:100K<n<1M", "language:es", "license:apache-2.0", "argentina", "law", "government", "region:us" ]
2024-02-12T16:20:31+00:00
{"language": ["es"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "pretty_name": "Bolet\u00edn Oficial de la Rep\u00fablica Argentina", "tags": ["argentina", "law", "government"]}
2024-02-17T01:00:10+00:00
[]
[ "es" ]
TAGS #size_categories-100K<n<1M #language-Spanish #license-apache-2.0 #argentina #law #government #region-us
# Boletín Oficial de la República Argentina This dataset is updated daily via URL, using the SandboxAI library # Format The dataset format is as follows:
[ "# Boletín Oficial de la República Argentina\nThis dataset is updated daily via URL, using the SandboxAI library", "# Format\nThe dataset format is as follows:" ]
[ "TAGS\n#size_categories-100K<n<1M #language-Spanish #license-apache-2.0 #argentina #law #government #region-us \n", "# Boletín Oficial de la República Argentina\nThis dataset is updated daily via URL, using the SandboxAI library", "# Format\nThe dataset format is as follows:" ]
3a28593851c73ba221a40fb1674b993b8d5670f9
**Code-290k-ShareGPT-Vicuna** This dataset is in Vicuna/ShareGPT format. There are around 290,000 sets of conversations, each set containing 2 conversations. Code in Python, Java, JavaScript, GO, C++, Rust, Ruby, SQL, MySQL, R, Julia, Haskell, etc., with detailed explanations, is provided. This dataset is built upon my existing datasets [Python-Code-23k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Python-Code-23k-ShareGPT) and [Code-74k-ShareGPT](https://huggingface.co/datasets/ajibawa-2023/Code-74k-ShareGPT).
cognitivecomputations/Code-290k-ShareGPT-Vicuna
[ "size_categories:100K<n<1M", "language:en", "license:apache-2.0", "code", "region:us" ]
2024-02-12T16:56:12+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "tags": ["code"]}
2024-02-12T17:01:45+00:00
[]
[ "en" ]
TAGS #size_categories-100K<n<1M #language-English #license-apache-2.0 #code #region-us
Code-290k-ShareGPT-Vicuna This dataset is in Vicuna/ShareGPT format. There are around 290,000 sets of conversations, each set containing 2 conversations. Code in Python, Java, JavaScript, GO, C++, Rust, Ruby, SQL, MySQL, R, Julia, Haskell, etc., with detailed explanations, is provided. This dataset is built upon my existing datasets Python-Code-23k-ShareGPT and Code-74k-ShareGPT.
[]
[ "TAGS\n#size_categories-100K<n<1M #language-English #license-apache-2.0 #code #region-us \n" ]
6cc5b0854bd84e40c0670c5a7a329cea02318d5d
annotations_creators: - expert-generated language_creators: - expert-generated language: - en license: - cc-by-nc-nd-4.0 multilinguality: - monolingual size_categories: - 10K<n<100K source_datasets: - original task_categories: - summarization task_ids: [] paperswithcode_id: samsum-corpus pretty_name: SAMSum Corpus tags: - conversations-summarization dataset_info: features: - name: id dtype: string - name: dialogue dtype: string - name: summary dtype: string config_name: samsum splits: - name: train num_bytes: 9479141 num_examples: 14732 - name: validation num_bytes: 516431 num_examples: 818 download_size: 2944100 dataset_size: 10530064 train-eval-index: - config: samsum task: summarization task_id: summarization splits: eval_split: test col_mapping: dialogue: text summary: target
longAtSJSU/reversedDataset
[ "task_categories:text-classification", "size_categories:1K<n<10K", "language:en", "license:apache-2.0", "region:us" ]
2024-02-12T16:58:44+00:00
{"language": ["en"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification"]}
2024-02-14T23:08:19+00:00
[]
[ "en" ]
TAGS #task_categories-text-classification #size_categories-1K<n<10K #language-English #license-apache-2.0 #region-us
annotations_creators: - expert-generated language_creators: - expert-generated language: - en license: - cc-by-nc-nd-4.0 multilinguality: - monolingual size_categories: - 10K<n<100K source_datasets: - original task_categories: - summarization task_ids: [] paperswithcode_id: samsum-corpus pretty_name: SAMSum Corpus tags: - conversations-summarization dataset_info: features: - name: id dtype: string - name: dialogue dtype: string - name: summary dtype: string config_name: samsum splits: - name: train num_bytes: 9479141 num_examples: 14732 - name: validation num_bytes: 516431 num_examples: 818 download_size: 2944100 dataset_size: 10530064 train-eval-index: - config: samsum task: summarization task_id: summarization splits: eval_split: test col_mapping: dialogue: text summary: target
[]
[ "TAGS\n#task_categories-text-classification #size_categories-1K<n<10K #language-English #license-apache-2.0 #region-us \n" ]
4271316a80191d3970b9e415b1af4eacdbf1d9de
# Dataset of Lisara Restall (So, I Can't Play H!) This is the dataset of Lisara Restall (So, I Can't Play H!), containing 426 images and their tags. The core tags of this character are `long_hair, red_hair, red_eyes`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 426 | 400.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_soicantplayh/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 426 | 293.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_soicantplayh/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 831 | 533.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_soicantplayh/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 426 | 399.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_soicantplayh/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 831 | 694.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_soicantplayh/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/lisara_restall_soicantplayh', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, bare_shoulders, sleeveless, red_rose, anime_coloring, hair_between_eyes, upper_body, looking_at_viewer | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, anime_coloring, solo, hair_between_eyes, parody, looking_at_viewer, open_mouth, portrait | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, anime_coloring, serafuku, hair_between_eyes | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, anime_coloring, hair_between_eyes, open_mouth, shiny_hair, solo, bangs, collarbone, portrait, bare_shoulders, blush, looking_at_viewer | | 4 | 12 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, skirt, very_long_hair, serafuku, solo, zettai_ryouiki, black_thighhighs | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1girl, skirt, sleeveless, solo, very_long_hair, thighhighs, zettai_ryouiki, hair_down | | 6 | 13 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, elbow_gloves, solo, zettai_ryouiki, very_long_hair, black_thighhighs, choker, dress, weapon, black_gloves, rain, scythe, dark | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, dress_shirt, solo, very_long_hair, barefoot, dog, hair_down, naked_shirt, pillow | | 8 | 22 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, solo, nipples, medium_breasts, nude | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | 2girls, serafuku | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | sleeveless | red_rose | anime_coloring | hair_between_eyes | upper_body | looking_at_viewer | parody | open_mouth | portrait | serafuku | shiny_hair | bangs | collarbone | blush | skirt | very_long_hair | zettai_ryouiki | black_thighhighs | thighhighs | hair_down | elbow_gloves | choker | dress | weapon | black_gloves | rain | scythe | dark | 
dress_shirt | barefoot | dog | naked_shirt | pillow | nipples | medium_breasts | nude | 2girls | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:-------------|:-----------|:-----------------|:--------------------|:-------------|:--------------------|:---------|:-------------|:-----------|:-----------|:-------------|:--------|:-------------|:--------|:--------|:-----------------|:-----------------|:-------------------|:-------------|:------------|:---------------|:---------|:--------|:---------|:---------------|:-------|:---------|:-------|:--------------|:-----------|:------|:--------------|:---------|:----------|:-----------------|:-------|:---------| | 0 | 33 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 2 | 10 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | 3 | 6 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | | | X | X | | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 4 | 12 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | | | | | | | | X | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | 5 | 6 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | X | | X | | | | | | | | | | | | | | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | 6 | 13 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | | | | | | | | | | | | | | | | | X | X | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | | | | | | | | | | | | | | | | | X | | | | X | | | | | | | | | X | X | X | X | X | | | | | | 8 | 22 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | 9 | 5 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | | | | | | | | | | | | | X | | | | | 
| | | | | | | | | | | | | | | | | | | | | | X |
CyberHarem/lisara_restall_soicantplayh
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T17:07:07+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-02-12T17:59:57+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Lisara Restall (So, I Can't Play H!) =============================================== This is the dataset of Lisara Restall (So, I Can't Play H!), containing 426 images and their tags. The core tags of this character are 'long\_hair, red\_hair, red\_eyes', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering results; maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
781230de7fa77b01eec9c021fe4fb88e24536fb7
# Dataset of Quele Sellier (So, I Can't Play H!) This is the dataset of Quele Sellier (So, I Can't Play H!), containing 224 images and their tags. The core tags of this character are `long_hair, hair_ornament, hair_flower, brown_eyes, brown_hair, blonde_hair, multicolored_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 224 | 183.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quele_sellier_soicantplayh/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 224 | 140.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quele_sellier_soicantplayh/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 398 | 245.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quele_sellier_soicantplayh/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 224 | 183.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quele_sellier_soicantplayh/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 398 | 311.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/quele_sellier_soicantplayh/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/quele_sellier_soicantplayh', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, anime_coloring, choker, bow, blue_flower, gradient_hair | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, gradient_hair, solo, profile, blue_rose | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, serafuku, solo, flower, sky, cloud, day | | 3 | 7 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, solo, horns, midriff, navel, gloves, skirt, blue_flower, pantyhose, rose, sword, very_long_hair | | 4 | 12 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | 1girl, solo, dress, long_sleeves, very_long_hair, blue_flower, gradient_hair, hair_between_eyes, sitting, breasts, purple_hair, chair, looking_at_viewer | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | anime_coloring | choker | bow | blue_flower | gradient_hair | profile | blue_rose | serafuku | flower | sky | cloud | day | horns | midriff | navel | gloves | skirt | pantyhose | rose | sword | very_long_hair | dress | long_sleeves | hair_between_eyes | sitting | breasts | purple_hair | chair | looking_at_viewer | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:---------|:------|:--------------|:----------------|:----------|:------------|:-----------|:---------|:------|:--------|:------|:--------|:----------|:--------|:---------|:--------|:------------|:-------|:--------|:-----------------|:--------|:---------------|:--------------------|:----------|:----------|:--------------|:--------|:--------------------| | 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 3 | 7 | ![](samples/3/clu3-sample0.png) 
| ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | 4 | 12 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | | | | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
CyberHarem/quele_sellier_soicantplayh
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T17:07:29+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-02-12T17:31:44+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Quele Sellier (So, I Can't Play H!) ============================================== This is the dataset of Quele Sellier (So, I Can't Play H!), containing 224 images and their tags. The core tags of this character are 'long\_hair, hair\_ornament, hair\_flower, brown\_eyes, brown\_hair, blonde\_hair, multicolored\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering results; maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
4d5d5d5092d4597b592bf9151356b00a7219902a
# Dataset of Iria Fukumune (So, I Can't Play H!) This is the dataset of Iria Fukumune (So, I Can't Play H!), containing 109 images and their tags. The core tags of this character are `blonde_hair, short_hair, blue_eyes, breasts, ribbon`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 109 | 98.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iria_fukumune_soicantplayh/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 109 | 73.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iria_fukumune_soicantplayh/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 209 | 135.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iria_fukumune_soicantplayh/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 109 | 98.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iria_fukumune_soicantplayh/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 209 | 170.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/iria_fukumune_soicantplayh/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/iria_fukumune_soicantplayh', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, serafuku, smile, anime_coloring, hair_between_eyes, looking_at_viewer | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, hair_between_eyes, bangs, looking_at_viewer, closed_mouth, collarbone, bare_shoulders, upper_body, anime_coloring, sleeveless, portrait, smile | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | serafuku | smile | anime_coloring | hair_between_eyes | looking_at_viewer | bangs | closed_mouth | collarbone | bare_shoulders | upper_body | sleeveless | portrait | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------|:-----------------|:--------------------|:--------------------|:--------|:---------------|:-------------|:-----------------|:-------------|:-------------|:-----------| | 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | | | | | | | | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | X | X | X | X | X | X | X | X |
CyberHarem/iria_fukumune_soicantplayh
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T17:08:03+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-02-12T17:20:21+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Iria Fukumune (So, I Can't Play H!) ============================================== This is the dataset of Iria Fukumune (So, I Can't Play H!), containing 109 images and their tags. The core tags of this character are 'blonde\_hair, short\_hair, blue\_eyes, breasts, ribbon', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering results; maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
286edf79d68331054c989e6d6ebb071bc25a0e28
# Dataset of Mina Okura (So, I Can't Play H!) This is the dataset of Mina Okura (So, I Can't Play H!), containing 167 images and their tags. The core tags of this character are `brown_hair, glasses, purple_eyes, short_hair`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 167 | 138.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_okura_soicantplayh/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 167 | 106.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_okura_soicantplayh/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 322 | 194.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_okura_soicantplayh/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 167 | 138.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_okura_soicantplayh/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 322 | 247.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mina_okura_soicantplayh/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/mina_okura_soicantplayh', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, 2girls, large_breasts, school_uniform, smile | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, open_mouth, school_uniform, solo, large_breasts, blush | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, anime_coloring, serafuku, smile | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, serafuku, skirt, solo, brown_eyes | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | 2girls | large_breasts | school_uniform | smile | open_mouth | solo | blush | anime_coloring | serafuku | skirt | brown_eyes | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:----------------|:-----------------|:--------|:-------------|:-------|:--------|:-----------------|:-----------|:--------|:-------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | | | | | | | | | 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | X | | X | X | X | | | | | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | X | | X | | X | X | | | | 3 | 5 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | | | | | | X | | | X | X | X |
CyberHarem/mina_okura_soicantplayh
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T17:08:53+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-02-12T17:27:47+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Mina Okura (So, I Can't Play H!) =========================================== This is the dataset of Mina Okura (So, I Can't Play H!), containing 167 images and their tags. The core tags of this character are 'brown\_hair, glasses, purple\_eyes, short\_hair', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering results; maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering results; maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
9e034bd1b976109eb6753b9bc4f2adef2ddf3dba
# Dataset of [Web-Based] Lisara Restall (So, I Can't Play H!) This is the dataset of [Web-Based] Lisara Restall (So, I Can't Play H!), containing 64 images and their tags. The core tags of this character are `red_hair, long_hair, red_eyes, very_long_hair, breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 64 | 82.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_fanart_soicantplayh/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 64 | 46.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_fanart_soicantplayh/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 126 | 87.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_fanart_soicantplayh/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 64 | 71.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_fanart_soicantplayh/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 126 | 128.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lisara_restall_fanart_soicantplayh/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/lisara_restall_fanart_soicantplayh', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, elbow_gloves, solo, black_gloves, black_thighhighs, dress, looking_at_viewer, horns, scythe, weapon | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, blush, on_back, dakimakura_(medium), navel, nude, bed_sheet, full_body, looking_at_viewer, nipples, black_thighhighs, medium_breasts, barefoot, open_mouth, pussy, small_breasts | | 2 | 17 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, blush, skirt, looking_at_viewer, black_thighhighs, smile, bare_shoulders, sleeveless_shirt, open_mouth, hair_between_eyes, red_rose, simple_background, white_background, white_shirt, zettai_ryouiki | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | solo | black_gloves | black_thighhighs | dress | looking_at_viewer | horns | scythe | weapon | blush | on_back | dakimakura_(medium) | navel | nude | bed_sheet | full_body | nipples | medium_breasts | barefoot | open_mouth | pussy | small_breasts | skirt | smile | bare_shoulders | sleeveless_shirt | hair_between_eyes | red_rose | simple_background | white_background | white_shirt | zettai_ryouiki | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:---------------|:-------------------|:--------|:--------------------|:--------|:---------|:---------|:--------|:----------|:----------------------|:--------|:-------|:------------|:------------|:----------|:-----------------|:-----------|:-------------|:--------|:----------------|:--------|:--------|:-----------------|:-------------------|:--------------------|:-----------|:--------------------|:-------------------|:--------------|:-----------------| | 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 2 | 17 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | X | | X | | | | X | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X |
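Building on the loading snippet above, a minimal sketch for pulling out only the items that carry a given tag (e.g. one of the cluster tags listed in the tables). It assumes the raw archive was already extracted to `dataset_dir` and that `item.meta['tags']` is either a mapping of tag names to scores or a plain list of names, as the `print` call above suggests:

```python
from waifuc.source import LocalSource

# Assumes the raw archive was extracted to `dataset_dir` as shown above.
source = LocalSource('dataset_dir')

wanted = 'black_thighhighs'  # e.g. a tag from cluster #0 in the tables above
for item in source:
    tags = item.meta['tags']
    # Handle both a tag -> score mapping and a plain list of tag names
    # (the exact type is an assumption).
    names = set(tags.keys()) if isinstance(tags, dict) else set(tags)
    if wanted in names:
        print(item.meta['filename'])
```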
CyberHarem/lisara_restall_fanart_soicantplayh
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T17:10:44+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-02-12T17:26:15+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of [Web-Based] Lisara Restall (So, I Can't Play H!) =========================================================== This is the dataset of [Web-Based] Lisara Restall (So, I Can't Play H!), containing 64 images and their tags. The core tags of this character are 'red\_hair, long\_hair, red\_eyes, very\_long\_hair, breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
b70e0caf90e9d77104b3f60413e5b13921c59163
# Dataset of Rose Oriana (Kage no Jitsuryokusha ni Naritakute!) This is the dataset of Rose Oriana (Kage no Jitsuryokusha ni Naritakute!), containing 63 images and their tags. The core tags of this character are `long_hair, blonde_hair, bangs, yellow_eyes, blunt_bangs`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 63 | 49.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rose_oriana_kagenojitsuryokushaninaritakute/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 63 | 37.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rose_oriana_kagenojitsuryokushaninaritakute/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 143 | 73.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rose_oriana_kagenojitsuryokushaninaritakute/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 63 | 49.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rose_oriana_kagenojitsuryokushaninaritakute/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 143 | 91.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/rose_oriana_kagenojitsuryokushaninaritakute/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/rose_oriana_kagenojitsuryokushaninaritakute', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, closed_mouth, profile, anime_coloring, cloud, from_side, outdoors, sky, portrait | | 1 | 13 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, upper_body, closed_mouth, jacket, looking_at_viewer, shirt, frown | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, very_long_hair, holding_sword, drill_hair, skirt, long_sleeves, looking_at_viewer | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | closed_mouth | profile | anime_coloring | cloud | from_side | outdoors | sky | portrait | upper_body | jacket | looking_at_viewer | shirt | frown | very_long_hair | holding_sword | drill_hair | skirt | long_sleeves | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:----------|:-----------------|:--------|:------------|:-----------|:------|:-----------|:-------------|:---------|:--------------------|:--------|:--------|:-----------------|:----------------|:-------------|:--------|:---------------| | 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | 1 | 13 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | | | | | | X | X | X | X | X | | | | | | | 2 | 16 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | | | | | | | | | X | | | X | X | X | X | X |
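As a complement to the cluster tables, a minimal sketch for surveying tag frequencies across the whole dataset; the most common tags should roughly match the cluster tags above. It assumes the raw archive was extracted to `dataset_dir` as in the loading snippet, and that `item.meta['tags']` is a mapping or list of tag names:

```python
from collections import Counter

from waifuc.source import LocalSource

# Assumes the raw archive was extracted to `dataset_dir` as shown above.
counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta['tags']
    # Treat tags as a tag -> score mapping or a plain list (an assumption).
    counter.update(tags.keys() if isinstance(tags, dict) else tags)

for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```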
CyberHarem/rose_oriana_kagenojitsuryokushaninaritakute
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T17:11:21+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-02-12T17:18:07+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Rose Oriana (Kage no Jitsuryokusha ni Naritakute!) ============================================================= This is the dataset of Rose Oriana (Kage no Jitsuryokusha ni Naritakute!), containing 63 images and their tags. The core tags of this character are 'long\_hair, blonde\_hair, bangs, yellow\_eyes, blunt\_bangs', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
089a21828cae9f92ddf3bfd41e7f2d3a01f5747f
# Dataset of Yashiya Yui (Rokudou no Onna-tachi) This is the dataset of Yashiya Yui (Rokudou no Onna-tachi), containing 63 images and their tags. The core tags of this character are `red_hair, long_hair, hair_over_one_eye, hair_ornament, breasts, large_breasts`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 63 | 50.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 63 | 37.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 118 | 67.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 63 | 50.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. | | stage3-p480-1200 | 118 | 87.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yashiya_yui_rokudounoonnatachi/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/yashiya_yui_rokudounoonnatachi', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------| | 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, blush, brown_eyes, hairclip, open_mouth, parody, asymmetrical_bangs, looking_at_viewer, portrait | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, cleavage, tied_shirt, midriff, solo, navel, plaid_skirt, yellow_shirt, looking_at_viewer, red_eyes, red_skirt, smile | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, cleavage, hairclip, 1boy, chain-link_fence, formal, jacket, white_shirt, skirt, suit, thighhighs, zettai_ryouiki | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | blush | brown_eyes | hairclip | open_mouth | parody | asymmetrical_bangs | looking_at_viewer | portrait | cleavage | tied_shirt | midriff | navel | plaid_skirt | yellow_shirt | red_eyes | red_skirt | 1boy | chain-link_fence | formal | jacket | white_shirt | skirt | suit | thighhighs | zettai_ryouiki | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:-------------|:-----------|:-------------|:---------|:---------------------|:--------------------|:-----------|:-----------|:-------------|:----------|:--------|:--------------|:---------------|:-----------|:------------|:-------|:-------------------|:---------|:---------|:--------------|:--------|:-------|:-------------|:-----------------| | 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | | | | X | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X |
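For the IMG+TXT packages listed above, each image ships with a plain-text tag file. A minimal sketch for extracting the 800-pixel variant and pairing images with their tags; the shared-stem `.txt` naming is an assumption based on the usual IMG+TXT convention, so verify it against the archive contents:

```python
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# Fetch the IMG+TXT variant (shorter side <= 800 px) from the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/yashiya_yui_rokudounoonnatachi',
    repo_type='dataset',
    filename='dataset-800.zip',
)

out_dir = Path('dataset_800')
out_dir.mkdir(exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# Pair each image with its tag file; the image extension and shared-stem
# naming are assumptions, so adjust the glob if needed.
for img_path in sorted(out_dir.rglob('*.png')):
    txt_path = img_path.with_suffix('.txt')
    if txt_path.exists():
        print(img_path.name, '->', txt_path.read_text(encoding='utf-8').strip())
```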
CyberHarem/yashiya_yui_rokudounoonnatachi
[ "task_categories:text-to-image", "size_categories:n<1K", "license:mit", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T17:12:59+00:00
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
2024-02-12T17:19:04+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
Dataset of Yashiya Yui (Rokudou no Onna-tachi) ============================================== This is the dataset of Yashiya Yui (Rokudou no Onna-tachi), containing 63 images and their tags. The core tags of this character are 'red\_hair, long\_hair, hair\_over\_one\_eye, hair\_ornament, breasts, large\_breasts', which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization). List of Packages ---------------- ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code List of Clusters ---------------- List of tag clustering result, maybe some outfits can be mined here. ### Raw Text Version ### Table Version
[ "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n", "### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.", "### Raw Text Version", "### Table Version" ]
bf8575799910e8db2c2bff8385b64756e9601e43
0: ENOUGH_INFO
1: NOT_ENOUGH_INFO
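A minimal sketch for loading this dataset and decoding the integer labels above. The `train` split and `label` column are assumptions, since the card does not document the schema; inspect the loaded dataset and adjust accordingly:

```python
from datasets import load_dataset

# Label mapping documented above.
ID2LABEL = {0: 'ENOUGH_INFO', 1: 'NOT_ENOUGH_INFO'}

# Split and column names are assumptions; the card gives no schema.
ds = load_dataset('iestynmullinor/climate_fever_reranker_training', split='train')
print(ds)

example = ds[0]
if 'label' in example:
    print(ID2LABEL.get(example['label'], example['label']))
```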
iestynmullinor/climate_fever_reranker_training
[ "region:us" ]
2024-02-12T17:17:23+00:00
{}
2024-02-12T20:00:02+00:00
[]
[]
TAGS #region-us
0: ENOUGH_INFO 1: NOT_ENOUGH_INFO
[]
[ "TAGS\n#region-us \n" ]
94fadcc0a30f85719d84dd2965dd94355bac7c76
# Dataset Card for Evaluation run of kidyu/Moza-7B-v1.0 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [kidyu/Moza-7B-v1.0](https://huggingface.co/kidyu/Moza-7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_kidyu__Moza-7B-v1.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T17:21:30.595988](https://huggingface.co/datasets/open-llm-leaderboard/details_kidyu__Moza-7B-v1.0/blob/main/results_2024-02-12T17-21-30.595988.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6307241384076334, "acc_stderr": 0.03239530739049263, "acc_norm": 0.631856278602437, "acc_norm_stderr": 0.033046495454686214, "mc1": 0.4773561811505508, "mc1_stderr": 0.017485542258489646, "mc2": 0.6515628442997716, "mc2_stderr": 0.015424932543956459 }, "harness|arc:challenge|25": { "acc": 0.6348122866894198, "acc_stderr": 0.014070265519268804, "acc_norm": 0.6655290102389079, "acc_norm_stderr": 0.013787460322441372 }, "harness|hellaswag|10": { "acc": 0.6593308105954989, "acc_stderr": 0.004729656826803945, "acc_norm": 0.8344951204939255, "acc_norm_stderr": 0.003708760752685524 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.28, "acc_stderr": 0.045126085985421296, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421296 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6074074074074074, "acc_stderr": 0.0421850621536888, "acc_norm": 0.6074074074074074, "acc_norm_stderr": 0.0421850621536888 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.65, "acc_stderr": 0.0479372485441102, "acc_norm": 0.65, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.45, "acc_stderr": 0.05, "acc_norm": 0.45, "acc_norm_stderr": 0.05 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.51, "acc_stderr": 0.05024183937956911, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6242774566473989, "acc_stderr": 0.036928207672648664, "acc_norm": 0.6242774566473989, "acc_norm_stderr": 0.036928207672648664 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.047551296160629475, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.047551296160629475 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5574468085106383, "acc_stderr": 0.03246956919789958, "acc_norm": 0.5574468085106383, "acc_norm_stderr": 0.03246956919789958 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.593103448275862, "acc_stderr": 0.04093793981266236, "acc_norm": 0.593103448275862, "acc_norm_stderr": 0.04093793981266236 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42328042328042326, "acc_stderr": 0.025446365634406783, "acc_norm": 0.42328042328042326, "acc_norm_stderr": 0.025446365634406783 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.42063492063492064, "acc_stderr": 0.04415438226743744, "acc_norm": 0.42063492063492064, "acc_norm_stderr": 0.04415438226743744 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7612903225806451, "acc_stderr": 0.02425107126220884, "acc_norm": 0.7612903225806451, "acc_norm_stderr": 0.02425107126220884 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.029376616484945633, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.029376616484945633 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8601036269430051, "acc_stderr": 0.025033870583015178, "acc_norm": 0.8601036269430051, "acc_norm_stderr": 0.025033870583015178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6333333333333333, "acc_stderr": 0.024433016466052462, "acc_norm": 0.6333333333333333, "acc_norm_stderr": 0.024433016466052462 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.37777777777777777, "acc_stderr": 0.029560707392465718, "acc_norm": 0.37777777777777777, "acc_norm_stderr": 0.029560707392465718 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977938, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977938 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.31788079470198677, "acc_stderr": 0.038020397601079024, "acc_norm": 0.31788079470198677, "acc_norm_stderr": 0.038020397601079024 }, 
"harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8458715596330275, "acc_stderr": 0.015480826865374303, "acc_norm": 0.8458715596330275, "acc_norm_stderr": 0.015480826865374303 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.027599174300640773, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.027599174300640773 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.031602951437766785, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.031602951437766785 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7480916030534351, "acc_stderr": 0.03807387116306085, "acc_norm": 0.7480916030534351, "acc_norm_stderr": 0.03807387116306085 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7116564417177914, "acc_stderr": 0.035590395316173425, "acc_norm": 0.7116564417177914, "acc_norm_stderr": 0.035590395316173425 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8173690932311622, "acc_stderr": 0.013816335389973147, "acc_norm": 0.8173690932311622, "acc_norm_stderr": 0.013816335389973147 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6965317919075145, "acc_stderr": 0.02475241196091721, "acc_norm": 0.6965317919075145, "acc_norm_stderr": 0.02475241196091721 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3039106145251397, "acc_stderr": 0.015382845587584518, "acc_norm": 0.3039106145251397, "acc_norm_stderr": 0.015382845587584518 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6764705882352942, "acc_stderr": 0.026787453111906504, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.026787453111906504 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.684887459807074, "acc_stderr": 0.026385273703464485, "acc_norm": 0.684887459807074, "acc_norm_stderr": 0.026385273703464485 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7283950617283951, "acc_stderr": 0.02474862449053737, "acc_norm": 0.7283950617283951, "acc_norm_stderr": 0.02474862449053737 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.43617021276595747, "acc_stderr": 0.029583452036284066, "acc_norm": 
0.43617021276595747, "acc_norm_stderr": 0.029583452036284066 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4530638852672751, "acc_stderr": 0.012713845972358981, "acc_norm": 0.4530638852672751, "acc_norm_stderr": 0.012713845972358981 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6544117647058824, "acc_stderr": 0.02888819310398863, "acc_norm": 0.6544117647058824, "acc_norm_stderr": 0.02888819310398863 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6356209150326797, "acc_stderr": 0.01946951822157369, "acc_norm": 0.6356209150326797, "acc_norm_stderr": 0.01946951822157369 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6363636363636364, "acc_stderr": 0.04607582090719976, "acc_norm": 0.6363636363636364, "acc_norm_stderr": 0.04607582090719976 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6979591836734694, "acc_stderr": 0.0293936093198798, "acc_norm": 0.6979591836734694, "acc_norm_stderr": 0.0293936093198798 }, "harness|hendrycksTest-sociology|5": { "acc": 0.835820895522388, "acc_stderr": 0.026193923544454115, "acc_norm": 0.835820895522388, "acc_norm_stderr": 0.026193923544454115 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.038786267710023595, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.038786267710023595 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.02917088550072767, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.02917088550072767 }, "harness|truthfulqa:mc|0": { "mc1": 0.4773561811505508, "mc1_stderr": 0.017485542258489646, "mc2": 0.6515628442997716, "mc2_stderr": 0.015424932543956459 }, "harness|winogrande|5": { "acc": 0.7750591949486977, "acc_stderr": 0.011735043564126735 }, "harness|gsm8k|5": { "acc": 0.6254738438210766, "acc_stderr": 0.013331774158491377 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
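As a complement to the `load_dataset` snippet above, a minimal sketch for fetching the aggregated results file linked under "Latest results" and printing the headline metrics. The filename is taken from that link; the top-level key layout is an assumption based on the JSON shown above:

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file linked under "Latest results".
results_path = hf_hub_download(
    repo_id='open-llm-leaderboard/details_kidyu__Moza-7B-v1.0',
    repo_type='dataset',
    filename='results_2024-02-12T17-21-30.595988.json',
)

with open(results_path, encoding='utf-8') as f:
    results = json.load(f)

# Handle both a flat layout and one nested under a "results" key
# (the exact layout is an assumption).
aggregated = results.get('results', results)
print(json.dumps(aggregated.get('all', aggregated), indent=2))
```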
open-llm-leaderboard/details_kidyu__Moza-7B-v1.0
[ "region:us" ]
2024-02-12T17:23:47+00:00
{"pretty_name": "Evaluation run of kidyu/Moza-7B-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [kidyu/Moza-7B-v1.0](https://huggingface.co/kidyu/Moza-7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kidyu__Moza-7B-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T17:21:30.595988](https://huggingface.co/datasets/open-llm-leaderboard/details_kidyu__Moza-7B-v1.0/blob/main/results_2024-02-12T17-21-30.595988.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6307241384076334,\n \"acc_stderr\": 0.03239530739049263,\n \"acc_norm\": 0.631856278602437,\n \"acc_norm_stderr\": 0.033046495454686214,\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.017485542258489646,\n \"mc2\": 0.6515628442997716,\n \"mc2_stderr\": 0.015424932543956459\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268804,\n \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441372\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6593308105954989,\n \"acc_stderr\": 0.004729656826803945,\n \"acc_norm\": 0.8344951204939255,\n \"acc_norm_stderr\": 0.003708760752685524\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.025446365634406783,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.025446365634406783\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.024433016466052462,\n 
\"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.024433016466052462\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973147,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973147\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.02475241196091721,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.02475241196091721\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3039106145251397,\n \"acc_stderr\": 0.015382845587584518,\n \"acc_norm\": 0.3039106145251397,\n \"acc_norm_stderr\": 0.015382845587584518\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.02888819310398863,\n \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.02888819310398863\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6356209150326797,\n \"acc_stderr\": 0.01946951822157369,\n \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.01946951822157369\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4773561811505508,\n \"mc1_stderr\": 0.017485542258489646,\n \"mc2\": 0.6515628442997716,\n \"mc2_stderr\": 0.015424932543956459\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126735\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6254738438210766,\n \"acc_stderr\": 0.013331774158491377\n }\n}\n```", "repo_url": "https://huggingface.co/kidyu/Moza-7B-v1.0", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|arc:challenge|25_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|gsm8k|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hellaswag|10_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-21-30.595988.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-21-30.595988.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-21-30.595988.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T17-21-30.595988.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-21-30.595988.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-21-30.595988.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["**/details_harness|winogrande|5_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T17-21-30.595988.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T17_21_30.595988", "path": ["results_2024-02-12T17-21-30.595988.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T17-21-30.595988.parquet"]}]}]}
2024-02-12T17:24:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of kidyu/Moza-7B-v1.0 Dataset automatically created during the evaluation run of model kidyu/Moza-7B-v1.0 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T17:21:30.595988 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
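The code block that originally followed "To load the details from a run, you can for instance do the following:" was stripped from this plain-text rendering of the card. A minimal sketch of what it would contain, assuming the dataset id follows the same `details_<org>__<model>` pattern used elsewhere in this dump, and using the `harness_winogrande_5` config listed in this record's metadata:

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; the "train" split
# always points at the latest run's results.
data = load_dataset(
    "open-llm-leaderboard/details_kidyu__Moza-7B-v1.0",  # id assumed from the naming pattern
    "harness_winogrande_5",
    split="train",
)
```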
[ "# Dataset Card for Evaluation run of kidyu/Moza-7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model kidyu/Moza-7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T17:21:30.595988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of kidyu/Moza-7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model kidyu/Moza-7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T17:21:30.595988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f1276dddb94de7240fbc6060080b1b3f478c0a7b
# Dataset Card for Evaluation run of BarraHome/Lucie-7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BarraHome/Lucie-7b](https://huggingface.co/BarraHome/Lucie-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BarraHome__Lucie-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T17:31:04.894348](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Lucie-7b/blob/main/results_2024-02-12T17-31-04.894348.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6032184784518743, "acc_stderr": 0.03333730204729809, "acc_norm": 0.607891645213564, "acc_norm_stderr": 0.03401402537730786, "mc1": 0.5226438188494492, "mc1_stderr": 0.01748554225848964, "mc2": 0.6766513448639357, "mc2_stderr": 0.015264009667659464 }, "harness|arc:challenge|25": { "acc": 0.575938566552901, "acc_stderr": 0.014441889627464392, "acc_norm": 0.6220136518771331, "acc_norm_stderr": 0.0141696645203031 }, "harness|hellaswag|10": { "acc": 0.6612228639713205, "acc_stderr": 0.004723266971563391, "acc_norm": 0.8481378211511651, "acc_norm_stderr": 0.0035815378475817935 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6597222222222222, "acc_stderr": 0.039621355734862175, "acc_norm": 0.6597222222222222, "acc_norm_stderr": 0.039621355734862175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm":
0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5838150289017341, "acc_stderr": 0.03758517775404947, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.03758517775404947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5148936170212766, "acc_stderr": 0.03267151848924777, "acc_norm": 0.5148936170212766, "acc_norm_stderr": 0.03267151848924777 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.041227371113703316, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.025010749116137602, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.025010749116137602 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6774193548387096, "acc_stderr": 0.026593084516572277, "acc_norm": 0.6774193548387096, "acc_norm_stderr": 0.026593084516572277 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7212121212121212, "acc_stderr": 0.03501438706296781, "acc_norm": 0.7212121212121212, "acc_norm_stderr": 0.03501438706296781 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7474747474747475, "acc_stderr": 0.030954055470365897, "acc_norm": 0.7474747474747475, "acc_norm_stderr": 0.030954055470365897 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.026148483469153314, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.026148483469153314 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5564102564102564, "acc_stderr": 0.0251891498947642, "acc_norm": 0.5564102564102564, "acc_norm_stderr": 0.0251891498947642 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.031282177063684614, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.031282177063684614 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, 
"harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8018348623853211, "acc_stderr": 0.017090573804217905, "acc_norm": 0.8018348623853211, "acc_norm_stderr": 0.017090573804217905 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4398148148148148, "acc_stderr": 0.03385177976044812, "acc_norm": 0.4398148148148148, "acc_norm_stderr": 0.03385177976044812 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.03019028245350195, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928275, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928275 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.040393149787245605, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.040393149787245605 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.04414343666854933, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.04414343666854933 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.03487825168497892, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.03487825168497892 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.04726835553719099, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.04726835553719099 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597552, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597552 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7739463601532567, "acc_stderr": 0.014957458504335842, "acc_norm": 0.7739463601532567, "acc_norm_stderr": 0.014957458504335842 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6676300578034682, "acc_stderr": 0.025361168749688225, "acc_norm": 0.6676300578034682, "acc_norm_stderr": 0.025361168749688225 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34972067039106147, "acc_stderr": 0.01594930879023364, "acc_norm": 0.34972067039106147, "acc_norm_stderr": 0.01594930879023364 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6797385620915033, "acc_stderr": 0.02671611838015685, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.02671611838015685 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6752411575562701, "acc_stderr": 0.026596782287697043, "acc_norm": 0.6752411575562701, "acc_norm_stderr": 0.026596782287697043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6759259259259259, "acc_stderr": 0.02604176620271716, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 0.02604176620271716 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.450354609929078, "acc_stderr": 0.029680105565029036, 
"acc_norm": 0.450354609929078, "acc_norm_stderr": 0.029680105565029036 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42698826597131684, "acc_stderr": 0.012633353557534427, "acc_norm": 0.42698826597131684, "acc_norm_stderr": 0.012633353557534427 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.019722058939618068, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.019722058939618068 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.0282638899437846, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.0282638899437846 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7512437810945274, "acc_stderr": 0.030567675938916714, "acc_norm": 0.7512437810945274, "acc_norm_stderr": 0.030567675938916714 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366255, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835816, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835816 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5226438188494492, "mc1_stderr": 0.01748554225848964, "mc2": 0.6766513448639357, "mc2_stderr": 0.015264009667659464 }, "harness|winogrande|5": { "acc": 0.7679558011049724, "acc_stderr": 0.011864149691827936 }, "harness|gsm8k|5": { "acc": 0.3957543593631539, "acc_stderr": 0.013469823701048815 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
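As the card above notes, each run is also addressable by its timestamped split rather than the `latest`/`train` aliases. A small usage sketch under two stated assumptions: the split name is inferred from this run's timestamp (2024-02-12T17:31:04.894348) following the `YYYY_MM_DDTHH_MM_SS.ffffff` convention seen in this dump's metadata, and the `harness_gsm8k_5` config is assumed to exist by analogy with the other cards' config lists:

```python
from datasets import load_dataset

# Pin a specific evaluation run by its timestamped split instead of "train".
# Split name inferred from the run timestamp; config name assumed by analogy.
data = load_dataset(
    "open-llm-leaderboard/details_BarraHome__Lucie-7b",
    "harness_gsm8k_5",
    split="2024_02_12T17_31_04.894348",
)
```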
open-llm-leaderboard/details_BarraHome__Lucie-7b
[ "region:us" ]
2024-02-12T17:33:24+00:00
{"pretty_name": "Evaluation run of BarraHome/Lucie-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarraHome/Lucie-7b](https://huggingface.co/BarraHome/Lucie-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__Lucie-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T17:31:04.894348](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Lucie-7b/blob/main/results_2024-02-12T17-31-04.894348.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032184784518743,\n \"acc_stderr\": 0.03333730204729809,\n \"acc_norm\": 0.607891645213564,\n \"acc_norm_stderr\": 0.03401402537730786,\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464392,\n \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.0141696645203031\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6612228639713205,\n \"acc_stderr\": 0.004723266971563391,\n \"acc_norm\": 0.8481378211511651,\n \"acc_norm_stderr\": 0.0035815378475817935\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.0251891498947642,\n 
\"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.0251891498947642\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n \"acc_stderr\": 0.014957458504335842,\n \"acc_norm\": 0.7739463601532567,\n \"acc_norm_stderr\": 
0.014957458504335842\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n \"acc_stderr\": 0.012633353557534427,\n \"acc_norm\": 0.42698826597131684,\n \"acc_norm_stderr\": 0.012633353557534427\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827936\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3957543593631539,\n \"acc_stderr\": 0.013469823701048815\n }\n}\n```", "repo_url": "https://huggingface.co/BarraHome/Lucie-7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|arc:challenge|25_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|gsm8k|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hellaswag|10_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-31-04.894348.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-31-04.894348.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-31-04.894348.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T17-31-04.894348.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-31-04.894348.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-31-04.894348.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["**/details_harness|winogrande|5_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T17-31-04.894348.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T17_31_04.894348", "path": ["results_2024-02-12T17-31-04.894348.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T17-31-04.894348.parquet"]}]}]}
2024-02-12T17:33:49+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BarraHome/Lucie-7b Dataset automatically created during the evaluation run of model BarraHome/Lucie-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T17:31:04.894348 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of BarraHome/Lucie-7b\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Lucie-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T17:31:04.894348(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BarraHome/Lucie-7b\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Lucie-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T17:31:04.894348(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
53fd34038180b525e1f0fde89a61af702b54ef61
# Dataset Card for Evaluation run of Radu1999/Mister <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Radu1999/Mister](https://huggingface.co/Radu1999/Mister) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Radu1999__Mister", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T17:41:45.329847](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mister/blob/main/results_2024-02-12T17-41-45.329847.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.4380766224972964, "acc_stderr": 0.03415086988103951, "acc_norm": 0.4431692648783782, "acc_norm_stderr": 0.034957532245339706, "mc1": 0.47613219094247244, "mc1_stderr": 0.017483547156961567, "mc2": 0.6585201087177726, "mc2_stderr": 0.015676724035471653 }, "harness|arc:challenge|25": { "acc": 0.5571672354948806, "acc_stderr": 0.014515573873348904, "acc_norm": 0.6168941979522184, "acc_norm_stderr": 0.014206472661672876 }, "harness|hellaswag|10": { "acc": 0.5134435371439953, "acc_stderr": 0.0049879774920421555, "acc_norm": 0.7173869747062338, "acc_norm_stderr": 0.0044934958720001285 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.24, "acc_stderr": 0.042923469599092816, "acc_norm": 0.24, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.48026315789473684, "acc_stderr": 0.040657710025626036, "acc_norm": 0.48026315789473684, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.4981132075471698, "acc_stderr": 0.030772653642075657, "acc_norm": 0.4981132075471698, "acc_norm_stderr": 0.030772653642075657 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.4791666666666667, "acc_stderr": 0.041775789507399935, "acc_norm": 0.4791666666666667, "acc_norm_stderr": 0.041775789507399935 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.32, "acc_stderr": 0.046882617226215034, "acc_norm": 0.32, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.4, "acc_stderr": 0.04923659639173309, "acc_norm": 0.4, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 
0.04605661864718381, "acc_norm": 0.3, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4508670520231214, "acc_stderr": 0.03794012674697029, "acc_norm": 0.4508670520231214, "acc_norm_stderr": 0.03794012674697029 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.27450980392156865, "acc_stderr": 0.04440521906179328, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.04440521906179328 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.425531914893617, "acc_stderr": 0.032321469162244695, "acc_norm": 0.425531914893617, "acc_norm_stderr": 0.032321469162244695 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3157894736842105, "acc_stderr": 0.043727482902780064, "acc_norm": 0.3157894736842105, "acc_norm_stderr": 0.043727482902780064 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.41379310344827586, "acc_stderr": 0.04104269211806232, "acc_norm": 0.41379310344827586, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3306878306878307, "acc_stderr": 0.024229965298425072, "acc_norm": 0.3306878306878307, "acc_norm_stderr": 0.024229965298425072 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2777777777777778, "acc_stderr": 0.04006168083848878, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.04006168083848878 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.28, "acc_stderr": 0.04512608598542129, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542129 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.44193548387096776, "acc_stderr": 0.02825155790684974, "acc_norm": 0.44193548387096776, "acc_norm_stderr": 0.02825155790684974 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.2955665024630542, "acc_stderr": 0.032104944337514575, "acc_norm": 0.2955665024630542, "acc_norm_stderr": 0.032104944337514575 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.23030303030303031, "acc_stderr": 0.03287666758603488, "acc_norm": 0.23030303030303031, "acc_norm_stderr": 0.03287666758603488 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.6060606060606061, "acc_stderr": 0.034812853382329624, "acc_norm": 0.6060606060606061, "acc_norm_stderr": 0.034812853382329624 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.6994818652849741, "acc_stderr": 0.0330881859441575, "acc_norm": 0.6994818652849741, "acc_norm_stderr": 0.0330881859441575 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4358974358974359, "acc_stderr": 0.025141801511177495, "acc_norm": 0.4358974358974359, "acc_norm_stderr": 0.025141801511177495 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2777777777777778, "acc_stderr": 0.027309140588230196, "acc_norm": 0.2777777777777778, "acc_norm_stderr": 0.027309140588230196 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.47478991596638653, "acc_stderr": 0.0324371805513741, "acc_norm": 0.47478991596638653, "acc_norm_stderr": 0.0324371805513741 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.2913907284768212, "acc_stderr": 0.037101857261199946, "acc_norm": 0.2913907284768212, "acc_norm_stderr": 
0.037101857261199946 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.6201834862385321, "acc_stderr": 0.020808825617866244, "acc_norm": 0.6201834862385321, "acc_norm_stderr": 0.020808825617866244 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.2916666666666667, "acc_stderr": 0.030998666304560538, "acc_norm": 0.2916666666666667, "acc_norm_stderr": 0.030998666304560538 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.27450980392156865, "acc_stderr": 0.031321798030832904, "acc_norm": 0.27450980392156865, "acc_norm_stderr": 0.031321798030832904 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.32489451476793246, "acc_stderr": 0.030486039389105313, "acc_norm": 0.32489451476793246, "acc_norm_stderr": 0.030486039389105313 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.4798206278026906, "acc_stderr": 0.033530461674123, "acc_norm": 0.4798206278026906, "acc_norm_stderr": 0.033530461674123 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.4732824427480916, "acc_stderr": 0.04379024936553894, "acc_norm": 0.4732824427480916, "acc_norm_stderr": 0.04379024936553894 }, "harness|hendrycksTest-international_law|5": { "acc": 0.5950413223140496, "acc_stderr": 0.04481137755942469, "acc_norm": 0.5950413223140496, "acc_norm_stderr": 0.04481137755942469 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.4662576687116564, "acc_stderr": 0.039194155450484096, "acc_norm": 0.4662576687116564, "acc_norm_stderr": 0.039194155450484096 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6504854368932039, "acc_stderr": 0.047211885060971716, "acc_norm": 0.6504854368932039, "acc_norm_stderr": 0.047211885060971716 }, "harness|hendrycksTest-marketing|5": { "acc": 0.7051282051282052, "acc_stderr": 0.029872577708891172, "acc_norm": 0.7051282051282052, "acc_norm_stderr": 0.029872577708891172 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.5977011494252874, "acc_stderr": 0.017535294529068945, "acc_norm": 0.5977011494252874, "acc_norm_stderr": 0.017535294529068945 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.44508670520231214, "acc_stderr": 0.026756255129663776, "acc_norm": 0.44508670520231214, "acc_norm_stderr": 0.026756255129663776 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2737430167597765, "acc_stderr": 0.01491241309637243, "acc_norm": 0.2737430167597765, "acc_norm_stderr": 0.01491241309637243 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.45098039215686275, "acc_stderr": 0.02849199358617156, "acc_norm": 0.45098039215686275, "acc_norm_stderr": 0.02849199358617156 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.4983922829581994, "acc_stderr": 0.02839794490780661, "acc_norm": 0.4983922829581994, "acc_norm_stderr": 0.02839794490780661 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.4506172839506173, "acc_stderr": 0.027684721415656196, "acc_norm": 0.4506172839506173, "acc_norm_stderr": 0.027684721415656196 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3333333333333333, "acc_stderr": 
0.02812163604063989, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.02812163604063989 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2920469361147327, "acc_stderr": 0.01161334913627182, "acc_norm": 0.2920469361147327, "acc_norm_stderr": 0.01161334913627182 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.3235294117647059, "acc_stderr": 0.028418208619406797, "acc_norm": 0.3235294117647059, "acc_norm_stderr": 0.028418208619406797 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4150326797385621, "acc_stderr": 0.019933627776857418, "acc_norm": 0.4150326797385621, "acc_norm_stderr": 0.019933627776857418 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6181818181818182, "acc_stderr": 0.04653429807913508, "acc_norm": 0.6181818181818182, "acc_norm_stderr": 0.04653429807913508 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.35918367346938773, "acc_stderr": 0.030713560455108493, "acc_norm": 0.35918367346938773, "acc_norm_stderr": 0.030713560455108493 }, "harness|hendrycksTest-sociology|5": { "acc": 0.6268656716417911, "acc_stderr": 0.03419832608176006, "acc_norm": 0.6268656716417911, "acc_norm_stderr": 0.03419832608176006 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.63, "acc_stderr": 0.04852365870939098, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939098 }, "harness|hendrycksTest-virology|5": { "acc": 0.39759036144578314, "acc_stderr": 0.03809973084540219, "acc_norm": 0.39759036144578314, "acc_norm_stderr": 0.03809973084540219 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.6783625730994152, "acc_stderr": 0.03582529442573122, "acc_norm": 0.6783625730994152, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.47613219094247244, "mc1_stderr": 0.017483547156961567, "mc2": 0.6585201087177726, "mc2_stderr": 0.015676724035471653 }, "harness|winogrande|5": { "acc": 0.7521704814522494, "acc_stderr": 0.012134386019865348 }, "harness|gsm8k|5": { "acc": 0.08718726307808947, "acc_stderr": 0.007770691416783539 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Radu1999__Mister
[ "region:us" ]
2024-02-12T17:44:03+00:00
{"pretty_name": "Evaluation run of Radu1999/Mister", "dataset_summary": "Dataset automatically created during the evaluation run of model [Radu1999/Mister](https://huggingface.co/Radu1999/Mister) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Radu1999__Mister\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T17:41:45.329847](https://huggingface.co/datasets/open-llm-leaderboard/details_Radu1999__Mister/blob/main/results_2024-02-12T17-41-45.329847.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4380766224972964,\n \"acc_stderr\": 0.03415086988103951,\n \"acc_norm\": 0.4431692648783782,\n \"acc_norm_stderr\": 0.034957532245339706,\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6585201087177726,\n \"mc2_stderr\": 0.015676724035471653\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.014515573873348904,\n \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672876\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5134435371439953,\n \"acc_stderr\": 0.0049879774920421555,\n \"acc_norm\": 0.7173869747062338,\n \"acc_norm_stderr\": 0.0044934958720001285\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.040657710025626036,\n \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.040657710025626036\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4981132075471698,\n \"acc_stderr\": 0.030772653642075657,\n \"acc_norm\": 0.4981132075471698,\n \"acc_norm_stderr\": 0.030772653642075657\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 
0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.032321469162244695,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.032321469162244695\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425072,\n \"acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04006168083848878,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04006168083848878\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.44193548387096776,\n \"acc_stderr\": 0.02825155790684974,\n \"acc_norm\": 0.44193548387096776,\n \"acc_norm_stderr\": 0.02825155790684974\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.034812853382329624,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.034812853382329624\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.0330881859441575,\n \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.0330881859441575\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4358974358974359,\n \"acc_stderr\": 
0.025141801511177495,\n \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.025141801511177495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230196,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230196\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.0324371805513741,\n \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.0324371805513741\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6201834862385321,\n \"acc_stderr\": 0.020808825617866244,\n \"acc_norm\": 0.6201834862385321,\n \"acc_norm_stderr\": 0.020808825617866244\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560538,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560538\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.031321798030832904,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.031321798030832904\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.32489451476793246,\n \"acc_stderr\": 0.030486039389105313,\n \"acc_norm\": 0.32489451476793246,\n \"acc_norm_stderr\": 0.030486039389105313\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4732824427480916,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.4732824427480916,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.039194155450484096,\n \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.039194155450484096\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.029872577708891172,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.029872577708891172\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5977011494252874,\n \"acc_stderr\": 0.017535294529068945,\n \"acc_norm\": 0.5977011494252874,\n 
\"acc_norm_stderr\": 0.017535294529068945\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.026756255129663776,\n \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.026756255129663776\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n \"acc_stderr\": 0.01491241309637243,\n \"acc_norm\": 0.2737430167597765,\n \"acc_norm_stderr\": 0.01491241309637243\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.02849199358617156,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.02849199358617156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4983922829581994,\n \"acc_stderr\": 0.02839794490780661,\n \"acc_norm\": 0.4983922829581994,\n \"acc_norm_stderr\": 0.02839794490780661\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4506172839506173,\n \"acc_stderr\": 0.027684721415656196,\n \"acc_norm\": 0.4506172839506173,\n \"acc_norm_stderr\": 0.027684721415656196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02812163604063989,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02812163604063989\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2920469361147327,\n \"acc_stderr\": 0.01161334913627182,\n \"acc_norm\": 0.2920469361147327,\n \"acc_norm_stderr\": 0.01161334913627182\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406797,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406797\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.019933627776857418,\n \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.019933627776857418\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n \"acc_stderr\": 0.03419832608176006,\n \"acc_norm\": 0.6268656716417911,\n \"acc_norm_stderr\": 0.03419832608176006\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.03809973084540219,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.03809973084540219\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961567,\n \"mc2\": 0.6585201087177726,\n \"mc2_stderr\": 0.015676724035471653\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865348\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08718726307808947,\n \"acc_stderr\": 0.007770691416783539\n }\n}\n```", "repo_url": "https://huggingface.co/Radu1999/Mister", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|arc:challenge|25_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|gsm8k|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hellaswag|10_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-41-45.329847.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-41-45.329847.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-41-45.329847.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T17-41-45.329847.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-41-45.329847.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T17-41-45.329847.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["**/details_harness|winogrande|5_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T17-41-45.329847.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T17_41_45.329847", "path": ["results_2024-02-12T17-41-45.329847.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T17-41-45.329847.parquet"]}]}]}
2024-02-12T17:44:31+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Radu1999/Mister Dataset automatically created during the evaluation run of model Radu1999/Mister on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T17:41:45.329847 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
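The loading snippet referenced by "you can for instance do the following" was stripped when this copy of the card was flattened; the call below restores it verbatim from the dataset_summary field above:

```python
from datasets import load_dataset

# Restored from the dataset_summary metadata of this record.
data = load_dataset(
    "open-llm-leaderboard/details_Radu1999__Mister",
    "harness_winogrande_5",
    split="train",
)
```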
[ "# Dataset Card for Evaluation run of Radu1999/Mister\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mister on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T17:41:45.329847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Radu1999/Mister\n\n\n\nDataset automatically created during the evaluation run of model Radu1999/Mister on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T17:41:45.329847(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4eff1224df969e2ed3e897eae06f7ddf7e0c58d8
# Dataset Card for "fashion_image_caption-100-v2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
leebissessar5/fashion_image_caption-100-v2
[ "region:us" ]
2024-02-12T17:52:33+00:00
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 22842342.0, "num_examples": 100}], "download_size": 22823707, "dataset_size": 22842342.0}}
2024-02-12T17:52:35+00:00
[]
[]
TAGS #region-us
# Dataset Card for "fashion_image_caption-100-v2" More Information needed
[ "# Dataset Card for \"fashion_image_caption-100-v2\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"fashion_image_caption-100-v2\"\n\nMore Information needed" ]
e5e53ff76a327b3001e61071347d7d5974e99700
# Dataset Card for Evaluation run of Qwen/Qwen1.5-4B-Chat <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Qwen/Qwen1.5-4B-Chat](https://huggingface.co/Qwen/Qwen1.5-4B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T18:02:11.797174](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat/blob/main/results_2024-02-12T18-02-11.797174.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5452693175035633, "acc_stderr": 0.033961232333256236, "acc_norm": 0.555864434515964, "acc_norm_stderr": 0.03480885079816102, "mc1": 0.29253365973072215, "mc1_stderr": 0.015925597445286165, "mc2": 0.4479029862950014, "mc2_stderr": 0.015185042808380176 }, "harness|arc:challenge|25": { "acc": 0.4044368600682594, "acc_stderr": 0.014342036483436175, "acc_norm": 0.4325938566552901, "acc_norm_stderr": 0.014478005694182528 }, "harness|hellaswag|10": { "acc": 0.5170284803823939, "acc_stderr": 0.004986886806565644, "acc_norm": 0.6972714598685521, "acc_norm_stderr": 0.004584997935360418 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5986842105263158, "acc_stderr": 0.039889037033362836, "acc_norm": 0.5986842105263158, "acc_norm_stderr": 0.039889037033362836 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5962264150943396, "acc_stderr": 0.03019761160019795, "acc_norm": 0.5962264150943396, "acc_norm_stderr": 0.03019761160019795 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5138888888888888, "acc_stderr": 0.04179596617581, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.04179596617581 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 
0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5433526011560693, "acc_stderr": 0.03798106566014498, "acc_norm": 0.5433526011560693, "acc_norm_stderr": 0.03798106566014498 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.3431372549019608, "acc_stderr": 0.04724007352383888, "acc_norm": 0.3431372549019608, "acc_norm_stderr": 0.04724007352383888 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.045126085985421276, "acc_norm": 0.72, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.44680851063829785, "acc_stderr": 0.0325005368436584, "acc_norm": 0.44680851063829785, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.35964912280701755, "acc_stderr": 0.045144961328736334, "acc_norm": 0.35964912280701755, "acc_norm_stderr": 0.045144961328736334 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.041443118108781526, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.041443118108781526 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.0255064816981382, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.0255064816981382 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3253968253968254, "acc_stderr": 0.04190596438871136, "acc_norm": 0.3253968253968254, "acc_norm_stderr": 0.04190596438871136 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6612903225806451, "acc_stderr": 0.026923446059302837, "acc_norm": 0.6612903225806451, "acc_norm_stderr": 0.026923446059302837 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.43842364532019706, "acc_stderr": 0.03491207857486519, "acc_norm": 0.43842364532019706, "acc_norm_stderr": 0.03491207857486519 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6666666666666666, "acc_stderr": 0.036810508691615486, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.036810508691615486 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7575757575757576, "acc_stderr": 0.030532892233932022, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.030532892233932022 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7668393782383419, "acc_stderr": 0.03051611137147602, "acc_norm": 0.7668393782383419, "acc_norm_stderr": 0.03051611137147602 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5256410256410257, "acc_stderr": 0.025317649726448663, "acc_norm": 0.5256410256410257, "acc_norm_stderr": 0.025317649726448663 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.029381620726465076, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.029381620726465076 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5252100840336135, "acc_stderr": 0.0324371805513741, "acc_norm": 0.5252100840336135, "acc_norm_stderr": 0.0324371805513741 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 
0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.708256880733945, "acc_stderr": 0.01948930096887653, "acc_norm": 0.708256880733945, "acc_norm_stderr": 0.01948930096887653 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.46296296296296297, "acc_stderr": 0.03400603625538272, "acc_norm": 0.46296296296296297, "acc_norm_stderr": 0.03400603625538272 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6911764705882353, "acc_stderr": 0.03242661719827218, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.03242661719827218 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7130801687763713, "acc_stderr": 0.02944377302259469, "acc_norm": 0.7130801687763713, "acc_norm_stderr": 0.02944377302259469 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5650224215246636, "acc_stderr": 0.033272833702713445, "acc_norm": 0.5650224215246636, "acc_norm_stderr": 0.033272833702713445 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.5801526717557252, "acc_stderr": 0.043285772152629715, "acc_norm": 0.5801526717557252, "acc_norm_stderr": 0.043285772152629715 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7272727272727273, "acc_stderr": 0.04065578140908705, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.04065578140908705 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.044143436668549335, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.044143436668549335 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6257668711656442, "acc_stderr": 0.03802068102899616, "acc_norm": 0.6257668711656442, "acc_norm_stderr": 0.03802068102899616 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.04669510663875191, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.04669510663875191 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280042, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280042 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8376068376068376, "acc_stderr": 0.02416161812798774, "acc_norm": 0.8376068376068376, "acc_norm_stderr": 0.02416161812798774 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7100893997445722, "acc_stderr": 0.01622501794477096, "acc_norm": 0.7100893997445722, "acc_norm_stderr": 0.01622501794477096 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6647398843930635, "acc_stderr": 0.025416003773165555, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.025416003773165555 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.26033519553072626, "acc_stderr": 0.014676252009319478, "acc_norm": 0.26033519553072626, "acc_norm_stderr": 0.014676252009319478 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6045751633986928, "acc_stderr": 0.02799672318063145, "acc_norm": 0.6045751633986928, "acc_norm_stderr": 0.02799672318063145 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5819935691318328, "acc_stderr": 0.028013651891995072, "acc_norm": 0.5819935691318328, "acc_norm_stderr": 0.028013651891995072 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5617283950617284, "acc_stderr": 0.027607914087400473, "acc_norm": 0.5617283950617284, "acc_norm_stderr": 0.027607914087400473 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4326241134751773, "acc_stderr": 
0.029555454236778852, "acc_norm": 0.4326241134751773, "acc_norm_stderr": 0.029555454236778852 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.40808344198174706, "acc_stderr": 0.012552598958563662, "acc_norm": 0.40808344198174706, "acc_norm_stderr": 0.012552598958563662 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.48161764705882354, "acc_stderr": 0.030352303395351964, "acc_norm": 0.48161764705882354, "acc_norm_stderr": 0.030352303395351964 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5375816993464052, "acc_stderr": 0.02017061497496976, "acc_norm": 0.5375816993464052, "acc_norm_stderr": 0.02017061497496976 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6272727272727273, "acc_stderr": 0.04631381319425465, "acc_norm": 0.6272727272727273, "acc_norm_stderr": 0.04631381319425465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6938775510204082, "acc_stderr": 0.029504896454595957, "acc_norm": 0.6938775510204082, "acc_norm_stderr": 0.029504896454595957 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7114427860696517, "acc_stderr": 0.03203841040213321, "acc_norm": 0.7114427860696517, "acc_norm_stderr": 0.03203841040213321 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366234, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366234 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.038879718495972646, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.038879718495972646 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.672514619883041, "acc_stderr": 0.035993357714560276, "acc_norm": 0.672514619883041, "acc_norm_stderr": 0.035993357714560276 }, "harness|truthfulqa:mc|0": { "mc1": 0.29253365973072215, "mc1_stderr": 0.015925597445286165, "mc2": 0.4479029862950014, "mc2_stderr": 0.015185042808380176 }, "harness|winogrande|5": { "acc": 0.6495659037095501, "acc_stderr": 0.013409047676670185 }, "harness|gsm8k|5": { "acc": 0.024260803639120546, "acc_stderr": 0.0042380079000014035 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
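As a concrete companion to the loading snippet above, here is a minimal sketch that pulls both the aggregated scores and the per-example details for this run. It assumes only the `datasets` library already shown in this card; the repository id, the configuration names, and the `latest` split are taken from this card's own metadata:

```python
from datasets import load_dataset

# Repository id from this dataset card.
REPO = "open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat"

# Aggregated metrics for the most recent run (the "results" configuration
# described above; the "latest" split always points to the newest run).
results = load_dataset(REPO, "results", split="latest")

# Per-example details for a single task, e.g. the 5-shot WinoGrande config.
winogrande = load_dataset(REPO, "harness_winogrande_5", split="latest")

print(results[0])       # aggregated scores for this run
print(len(winogrande))  # number of evaluated WinoGrande examples
```

The same pattern works for any of the 63 configurations listed in this dataset's metadata (for example `harness_gsm8k_5` or `harness_hendrycksTest_astronomy_5`); only the configuration name changes.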
open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat
[ "region:us" ]
2024-02-12T18:04:16+00:00
{"pretty_name": "Evaluation run of Qwen/Qwen1.5-4B-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [Qwen/Qwen1.5-4B-Chat](https://huggingface.co/Qwen/Qwen1.5-4B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T18:02:11.797174](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat/blob/main/results_2024-02-12T18-02-11.797174.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5452693175035633,\n \"acc_stderr\": 0.033961232333256236,\n \"acc_norm\": 0.555864434515964,\n \"acc_norm_stderr\": 0.03480885079816102,\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4479029862950014,\n \"mc2_stderr\": 0.015185042808380176\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436175,\n \"acc_norm\": 0.4325938566552901,\n \"acc_norm_stderr\": 0.014478005694182528\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5170284803823939,\n \"acc_stderr\": 0.004986886806565644,\n \"acc_norm\": 0.6972714598685521,\n \"acc_norm_stderr\": 0.004584997935360418\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.04179596617581,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.04179596617581\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 
0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.0255064816981382,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.0255064816981382\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302837,\n \"acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302837\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036810508691615486,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036810508691615486\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5256410256410257,\n \"acc_stderr\": 
0.025317649726448663,\n \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448663\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.708256880733945,\n \"acc_stderr\": 0.01948930096887653,\n \"acc_norm\": 0.708256880733945,\n \"acc_norm_stderr\": 0.01948930096887653\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.03242661719827218,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.03242661719827218\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899616,\n \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899616\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n \"acc_stderr\": 0.01622501794477096,\n \"acc_norm\": 0.7100893997445722,\n 
\"acc_norm_stderr\": 0.01622501794477096\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.025416003773165555,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.025416003773165555\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26033519553072626,\n \"acc_stderr\": 0.014676252009319478,\n \"acc_norm\": 0.26033519553072626,\n \"acc_norm_stderr\": 0.014676252009319478\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063145,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063145\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5617283950617284,\n \"acc_stderr\": 0.027607914087400473,\n \"acc_norm\": 0.5617283950617284,\n \"acc_norm_stderr\": 0.027607914087400473\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n \"acc_stderr\": 0.012552598958563662,\n \"acc_norm\": 0.40808344198174706,\n \"acc_norm_stderr\": 0.012552598958563662\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.4479029862950014,\n \"mc2_stderr\": 0.015185042808380176\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6495659037095501,\n \"acc_stderr\": 0.013409047676670185\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.024260803639120546,\n \"acc_stderr\": 0.0042380079000014035\n }\n}\n```", "repo_url": "https://huggingface.co/Qwen/Qwen1.5-4B-Chat", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|arc:challenge|25_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|gsm8k|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hellaswag|10_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-02-11.797174.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["**/details_harness|winogrande|5_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T18-02-11.797174.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T18_02_11.797174", "path": ["results_2024-02-12T18-02-11.797174.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T18-02-11.797174.parquet"]}]}]}
2024-02-12T18:04:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Qwen/Qwen1.5-4B-Chat Dataset automatically created during the evaluation run of model Qwen/Qwen1.5-4B-Chat on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T18:02:11.797174 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
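The flattened card text above ends at "do the following:" because the fenced snippet was stripped when the markdown was flattened. A minimal reconstruction, assuming the `details_<org>__<model>` repo-naming pattern and the per-task config names used by the other cards in this dump:

```python
from datasets import load_dataset

# Repo id is an assumption inferred from the details_<org>__<model> naming
# convention used elsewhere in this dump; "harness_winogrande_5" is one of
# the 63 per-task configs.
data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-4B-Chat",
                    "harness_winogrande_5",
                    split="train")
```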
[ "# Dataset Card for Evaluation run of Qwen/Qwen1.5-4B-Chat\n\n\n\nDataset automatically created during the evaluation run of model Qwen/Qwen1.5-4B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T18:02:11.797174(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Qwen/Qwen1.5-4B-Chat\n\n\n\nDataset automatically created during the evaluation run of model Qwen/Qwen1.5-4B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T18:02:11.797174(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4210f3597640caefce868fa03ae154831f77c557
# Dataset Card for Evaluation run of Qwen/Qwen1.5-7B-Chat <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Qwen/Qwen1.5-7B-Chat](https://huggingface.co/Qwen/Qwen1.5-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Qwen__Qwen1.5-7B-Chat", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T18:09:38.337578](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-7B-Chat/blob/main/results_2024-02-12T18-09-38.337578.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.607687110062688, "acc_stderr": 0.033077176361281796, "acc_norm": 0.6183668690016029, "acc_norm_stderr": 0.033800712564644564, "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836886, "mc2": 0.5354325822354306, "mc2_stderr": 0.01599058702795668 }, "harness|arc:challenge|25": { "acc": 0.523037542662116, "acc_stderr": 0.014595873205358266, "acc_norm": 0.5588737201365188, "acc_norm_stderr": 0.014509747749064663 }, "harness|hellaswag|10": { "acc": 0.5938060147381, "acc_stderr": 0.0049011789179008464, "acc_norm": 0.7856004779924318, "acc_norm_stderr": 0.004095663731959211 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4888888888888889, "acc_stderr": 0.04318275491977976, "acc_norm": 0.4888888888888889, "acc_norm_stderr": 0.04318275491977976 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6776315789473685, "acc_stderr": 0.03803510248351585, "acc_norm": 0.6776315789473685, "acc_norm_stderr": 0.03803510248351585 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.02854479331905533, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.02854479331905533 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7083333333333334, "acc_stderr": 0.03800968060554858, "acc_norm": 0.7083333333333334, "acc_norm_stderr": 0.03800968060554858 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.59, "acc_stderr": 0.049431107042371025, "acc_norm": 0.59, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.33, "acc_stderr":
0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6069364161849711, "acc_stderr": 0.037242495958177295, "acc_norm": 0.6069364161849711, "acc_norm_stderr": 0.037242495958177295 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5319148936170213, "acc_stderr": 0.03261936918467382, "acc_norm": 0.5319148936170213, "acc_norm_stderr": 0.03261936918467382 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4473684210526316, "acc_stderr": 0.04677473004491199, "acc_norm": 0.4473684210526316, "acc_norm_stderr": 0.04677473004491199 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6413793103448275, "acc_stderr": 0.039966295748767186, "acc_norm": 0.6413793103448275, "acc_norm_stderr": 0.039966295748767186 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47354497354497355, "acc_stderr": 0.02571523981134676, "acc_norm": 0.47354497354497355, "acc_norm_stderr": 0.02571523981134676 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5158730158730159, "acc_stderr": 0.044698818540726076, "acc_norm": 0.5158730158730159, "acc_norm_stderr": 0.044698818540726076 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.0479372485441102, "acc_norm": 0.35, "acc_norm_stderr": 0.0479372485441102 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7516129032258064, "acc_stderr": 0.024580028921481003, "acc_norm": 0.7516129032258064, "acc_norm_stderr": 0.024580028921481003 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5714285714285714, "acc_stderr": 0.03481904844438803, "acc_norm": 0.5714285714285714, "acc_norm_stderr": 0.03481904844438803 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7515151515151515, "acc_stderr": 0.033744026441394036, "acc_norm": 0.7515151515151515, "acc_norm_stderr": 0.033744026441394036 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.797979797979798, "acc_stderr": 0.028606204289229865, "acc_norm": 0.797979797979798, "acc_norm_stderr": 0.028606204289229865 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7979274611398963, "acc_stderr": 0.02897908979429673, "acc_norm": 0.7979274611398963, "acc_norm_stderr": 0.02897908979429673 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5974358974358974, "acc_stderr": 0.02486499515976775, "acc_norm": 0.5974358974358974, "acc_norm_stderr": 0.02486499515976775 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2962962962962963, "acc_stderr": 0.027840811495871923, "acc_norm": 0.2962962962962963, "acc_norm_stderr": 0.027840811495871923 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6512605042016807, "acc_stderr": 0.030956636328566545, "acc_norm": 0.6512605042016807, "acc_norm_stderr": 0.030956636328566545 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 
0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8110091743119267, "acc_stderr": 0.016785481159203613, "acc_norm": 0.8110091743119267, "acc_norm_stderr": 0.016785481159203613 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4583333333333333, "acc_stderr": 0.03398110890294636, "acc_norm": 0.4583333333333333, "acc_norm_stderr": 0.03398110890294636 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7990196078431373, "acc_stderr": 0.02812597226565437, "acc_norm": 0.7990196078431373, "acc_norm_stderr": 0.02812597226565437 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7679324894514767, "acc_stderr": 0.027479744550808503, "acc_norm": 0.7679324894514767, "acc_norm_stderr": 0.027479744550808503 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6412556053811659, "acc_stderr": 0.032190792004199956, "acc_norm": 0.6412556053811659, "acc_norm_stderr": 0.032190792004199956 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7022900763358778, "acc_stderr": 0.040103589424622034, "acc_norm": 0.7022900763358778, "acc_norm_stderr": 0.040103589424622034 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6380368098159509, "acc_stderr": 0.037757007291414416, "acc_norm": 0.6380368098159509, "acc_norm_stderr": 0.037757007291414416 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.7378640776699029, "acc_stderr": 0.04354631077260595, "acc_norm": 0.7378640776699029, "acc_norm_stderr": 0.04354631077260595 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8675213675213675, "acc_stderr": 0.022209309073165616, "acc_norm": 0.8675213675213675, "acc_norm_stderr": 0.022209309073165616 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.768837803320562, "acc_stderr": 0.015075523238101083, "acc_norm": 0.768837803320562, "acc_norm_stderr": 0.015075523238101083 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6705202312138728, "acc_stderr": 0.025305258131879713, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.025305258131879713 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38212290502793295, "acc_stderr": 0.01625113971157077, "acc_norm": 0.38212290502793295, "acc_norm_stderr": 0.01625113971157077 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7026143790849673, "acc_stderr": 0.026173908506718576, "acc_norm": 0.7026143790849673, "acc_norm_stderr": 0.026173908506718576 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7202572347266881, "acc_stderr": 0.02549425935069491, "acc_norm": 0.7202572347266881, "acc_norm_stderr": 0.02549425935069491 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6604938271604939, "acc_stderr": 0.026348564412011624, "acc_norm": 0.6604938271604939, "acc_norm_stderr": 0.026348564412011624 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.39361702127659576, "acc_stderr": 0.02914454478159616, "acc_norm": 
0.39361702127659576, "acc_norm_stderr": 0.02914454478159616 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46153846153846156, "acc_stderr": 0.012732398286190444, "acc_norm": 0.46153846153846156, "acc_norm_stderr": 0.012732398286190444 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5882352941176471, "acc_stderr": 0.029896163033125474, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.029896163033125474 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5816993464052288, "acc_stderr": 0.019955975145835546, "acc_norm": 0.5816993464052288, "acc_norm_stderr": 0.019955975145835546 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6090909090909091, "acc_stderr": 0.04673752333670239, "acc_norm": 0.6090909090909091, "acc_norm_stderr": 0.04673752333670239 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6938775510204082, "acc_stderr": 0.02950489645459596, "acc_norm": 0.6938775510204082, "acc_norm_stderr": 0.02950489645459596 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7611940298507462, "acc_stderr": 0.03014777593540922, "acc_norm": 0.7611940298507462, "acc_norm_stderr": 0.03014777593540922 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774708, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774708 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.03887971849597264, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.03274485211946956, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.03274485211946956 }, "harness|truthfulqa:mc|0": { "mc1": 0.36107711138310894, "mc1_stderr": 0.016814312844836886, "mc2": 0.5354325822354306, "mc2_stderr": 0.01599058702795668 }, "harness|winogrande|5": { "acc": 0.6771902131018153, "acc_stderr": 0.013140498173357952 }, "harness|gsm8k|5": { "acc": 0.13570887035633056, "acc_stderr": 0.009433577908567334 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_Qwen__Qwen1.5-7B-Chat
[ "region:us" ]
2024-02-12T18:11:45+00:00
{"pretty_name": "Evaluation run of Qwen/Qwen1.5-7B-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [Qwen/Qwen1.5-7B-Chat](https://huggingface.co/Qwen/Qwen1.5-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Qwen__Qwen1.5-7B-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T18:09:38.337578](https://huggingface.co/datasets/open-llm-leaderboard/details_Qwen__Qwen1.5-7B-Chat/blob/main/results_2024-02-12T18-09-38.337578.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.607687110062688,\n \"acc_stderr\": 0.033077176361281796,\n \"acc_norm\": 0.6183668690016029,\n \"acc_norm_stderr\": 0.033800712564644564,\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5354325822354306,\n \"mc2_stderr\": 0.01599058702795668\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.523037542662116,\n \"acc_stderr\": 0.014595873205358266,\n \"acc_norm\": 0.5588737201365188,\n \"acc_norm_stderr\": 0.014509747749064663\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5938060147381,\n \"acc_stderr\": 0.0049011789179008464,\n \"acc_norm\": 0.7856004779924318,\n \"acc_norm_stderr\": 0.004095663731959211\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 
0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.037242495958177295,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.037242495958177295\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47354497354497355,\n \"acc_stderr\": 0.02571523981134676,\n \"acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.02571523981134676\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.03481904844438803,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.03481904844438803\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5974358974358974,\n \"acc_stderr\": 
0.02486499515976775,\n \"acc_norm\": 0.5974358974358974,\n \"acc_norm_stderr\": 0.02486499515976775\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203613,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203613\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.768837803320562,\n \"acc_stderr\": 0.015075523238101083,\n \"acc_norm\": 0.768837803320562,\n \"acc_norm_stderr\": 0.015075523238101083\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879713,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879713\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n \"acc_stderr\": 0.01625113971157077,\n \"acc_norm\": 0.38212290502793295,\n \"acc_norm_stderr\": 0.01625113971157077\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159616,\n \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159616\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.012732398286190444,\n \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.012732398286190444\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.029896163033125474,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.029896163033125474\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835546,\n \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835546\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36107711138310894,\n \"mc1_stderr\": 0.016814312844836886,\n \"mc2\": 0.5354325822354306,\n \"mc2_stderr\": 0.01599058702795668\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6771902131018153,\n \"acc_stderr\": 0.013140498173357952\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13570887035633056,\n \"acc_stderr\": 0.009433577908567334\n }\n}\n```", "repo_url": "https://huggingface.co/Qwen/Qwen1.5-7B-Chat", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|arc:challenge|25_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|gsm8k|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hellaswag|10_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-09-38.337578.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-09-38.337578.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-09-38.337578.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T18-09-38.337578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-09-38.337578.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-09-38.337578.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["**/details_harness|winogrande|5_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T18-09-38.337578.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T18_09_38.337578", "path": ["results_2024-02-12T18-09-38.337578.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T18-09-38.337578.parquet"]}]}]}
2024-02-12T18:12:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Qwen/Qwen1.5-7B-Chat Dataset automatically created during the evaluation run of model Qwen/Qwen1.5-7B-Chat on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T18:09:38.337578 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
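The loading snippet referenced just above was stripped out of this rendering of the card. As a minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming (the config name `harness_winogrande_5` does appear in this record's metadata), it would look like:

```python
from datasets import load_dataset

# Minimal sketch; the repo id below is an assumption based on the
# leaderboard's "details_<org>__<model>" naming convention, while the
# config name comes from this record's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_Qwen__Qwen1.5-7B-Chat",
    "harness_winogrande_5",
    split="train",
)
```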
[ "# Dataset Card for Evaluation run of Qwen/Qwen1.5-7B-Chat\n\n\n\nDataset automatically created during the evaluation run of model Qwen/Qwen1.5-7B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T18:09:38.337578(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Qwen/Qwen1.5-7B-Chat\n\n\n\nDataset automatically created during the evaluation run of model Qwen/Qwen1.5-7B-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T18:09:38.337578(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
8ea7268102b651fa94bdd9f1eef9291ca32aca4b
# Dataset Card for "MarkedDataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lollitor/MarkedDataset
[ "region:us" ]
2024-02-12T18:21:48+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "LABEL", "dtype": "float64"}, {"name": "INPUT", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 5541840, "num_examples": 7638}, {"name": "validation", "num_bytes": 611074, "num_examples": 849}], "download_size": 3257011, "dataset_size": 6152914}}
2024-02-12T18:21:54+00:00
[]
[]
TAGS #region-us
# Dataset Card for "MarkedDataset" More Information needed
[ "# Dataset Card for \"MarkedDataset\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"MarkedDataset\"\n\nMore Information needed" ]
3978dc3d851e06cea72ec5cf6cfa94334c50d136
# ZNO dataset

This dataset contains machine-readable questions and answers from the Ukrainian External Independent Testing (called _ЗНО_/_ZNO_ in Ukrainian). Question subjects are:

- History of Ukraine
- Ukrainian language and literature

Currently, only the train subset (3063 question/answer pairs) is released. We will release the test set soon.

## File format

Every line in a .jsonl file contains a structure like this:

```js
{
  "question": "На другий склад падає наголос у слові",
  "answers": [
    { "marker": "А", "text": "начинка" },
    { "marker": "Б", "text": "випадок" },
    { "marker": "В", "text": "дрова" },
    { "marker": "Г", "text": "загадка" },
    { "marker": "Д", "text": "русло" }
  ],
  "correct_answers": ["Д"],
  "subject": "ukrainian-language-and-literature"
}
```

Currently, all questions have exactly one correct answer, stored in `correct_answers[0]`.

## Dataset structure

| Subject                           | Subset | Size           |
|-----------------------------------|--------|----------------|
| ukrainian-language-and-literature | train  | 1925 questions |
| history-of-ukraine                | train  | 1138 questions |

## Pre/postprocessing

* Question text is converted to Markdown.
* Questions with images are skipped (~600 cases).
* Open-ended questions ("Write your thoughts about...") are skipped.
* Questions that require matching multiple statements to multiple choices are skipped.
* Links to full texts of literature pieces are removed.

## Sources

- https://zno.osvita.ua/comtests.html

## Other

This dataset is used in the [UNLP 2024 Shared Task](https://github.com/unlp-workshop/unlp-2024-shared-task)
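To make the file format above concrete, here is a minimal parsing sketch; the local filename `train.jsonl` is an assumption (the actual filename in the repository may differ):

```python
import json

# Minimal sketch, assuming the train split is stored locally as
# "train.jsonl"; the field names match the structure documented above.
with open("train.jsonl", encoding="utf-8") as f:
    for line in f:
        item = json.loads(line)
        marker = item["correct_answers"][0]  # exactly one correct answer
        answer = next(
            a["text"] for a in item["answers"] if a["marker"] == marker
        )
        print(item["subject"], marker, answer)
```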
osyvokon/zno
[ "license:mit", "region:us" ]
2024-02-12T18:36:01+00:00
{"license": "mit"}
2024-02-12T19:07:13+00:00
[]
[]
TAGS #license-mit #region-us
ZNO dataset =========== This dataset contains machine-readable questions and answers from the Ukrainian External Independent Testing (called *ЗНО*/*ZNO* in Ukrainian). Question subjects are: * History of Ukraine * Ukrainian language and literature Currently, only the train subset (3063 question/answer pairs) is released. We will release the test set soon. File format ----------- Every line in a .jsonl file contains a structure like this: Currently, all questions have exactly one correct answer, stored in 'correct\_answers[0]'. Dataset structure ----------------- Subject: ukrainian-language-and-literature, Subset: train, Size: 1925 questions Subject: history-of-ukraine, Subset: train, Size: 1138 questions Pre/postprocessing ------------------ * Question text is converted to Markdown. * Questions with images are skipped (~600 cases). * Open-ended questions ("Write your thoughts about...") are skipped. * Questions that require matching multiple statements to multiple choices are skipped. * Links to full texts of literature pieces are removed. Sources: -------- * URL Other ----- This dataset is used in the UNLP 2024 Shared Task
[]
[ "TAGS\n#license-mit #region-us \n" ]
bc0b1723157cf2cf8557c1d515fc502f3e33be96
# Dataset Card for Evaluation run of paulml/OGNO-7B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [paulml/OGNO-7B](https://huggingface.co/paulml/OGNO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_paulml__OGNO-7B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T18:37:16.795511](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OGNO-7B/blob/main/results_2024-02-12T18-37-16.795511.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6522805182245757, "acc_stderr": 0.031991089349509304, "acc_norm": 0.6514462866675927, "acc_norm_stderr": 0.032662107614853914, "mc1": 0.6193390452876377, "mc1_stderr": 0.01699762787190791, "mc2": 0.7652268363883217, "mc2_stderr": 0.014007821850957183 }, "harness|arc:challenge|25": { "acc": 0.71160409556314, "acc_stderr": 0.013238394422428173, "acc_norm": 0.7312286689419796, "acc_norm_stderr": 0.012955065963710695 }, "harness|hellaswag|10": { "acc": 0.7151961760605458, "acc_stderr": 0.004503985839041969, "acc_norm": 0.8899621589324835, "acc_norm_stderr": 0.0031229736320394718 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6370370370370371, "acc_stderr": 0.04153948404742398, "acc_norm": 0.6370370370370371, "acc_norm_stderr": 0.04153948404742398 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7105263157894737, "acc_stderr": 0.03690677986137283, "acc_norm": 0.7105263157894737, "acc_norm_stderr": 0.03690677986137283 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.29, "acc_stderr": 0.04560480215720684,
"acc_norm": 0.29, "acc_norm_stderr": 0.04560480215720684 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41005291005291006, "acc_stderr": 0.025331202438944427, "acc_norm": 0.41005291005291006, "acc_norm_stderr": 0.025331202438944427 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.49206349206349204, "acc_stderr": 0.044715725362943486, "acc_norm": 0.49206349206349204, "acc_norm_stderr": 0.044715725362943486 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7838709677419354, "acc_stderr": 0.023415293433568525, "acc_norm": 0.7838709677419354, "acc_norm_stderr": 0.023415293433568525 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.803030303030303, "acc_stderr": 0.028335609732463362, "acc_norm": 0.803030303030303, "acc_norm_stderr": 0.028335609732463362 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.917098445595855, "acc_stderr": 0.01989934131572178, "acc_norm": 0.917098445595855, "acc_norm_stderr": 0.01989934131572178 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6641025641025641, "acc_stderr": 0.023946724741563973, "acc_norm": 0.6641025641025641, "acc_norm_stderr": 0.023946724741563973 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028593, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028593 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886797, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886797 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, 
"harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.015555802713590167, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.015555802713590167 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5046296296296297, "acc_stderr": 0.03409825519163572, "acc_norm": 0.5046296296296297, "acc_norm_stderr": 0.03409825519163572 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8382352941176471, "acc_stderr": 0.02584501798692692, "acc_norm": 0.8382352941176471, "acc_norm_stderr": 0.02584501798692692 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7603305785123967, "acc_stderr": 0.03896878985070416, "acc_norm": 0.7603305785123967, "acc_norm_stderr": 0.03896878985070416 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7852760736196319, "acc_stderr": 0.032262193772867744, "acc_norm": 0.7852760736196319, "acc_norm_stderr": 0.032262193772867744 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4107142857142857, "acc_stderr": 0.046695106638751906, "acc_norm": 0.4107142857142857, "acc_norm_stderr": 0.046695106638751906 }, "harness|hendrycksTest-management|5": { "acc": 0.7669902912621359, "acc_stderr": 0.04185832598928315, "acc_norm": 0.7669902912621359, "acc_norm_stderr": 0.04185832598928315 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.02093019318517933, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.02093019318517933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8212005108556832, "acc_stderr": 0.013702643715368983, "acc_norm": 0.8212005108556832, "acc_norm_stderr": 0.013702643715368983 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.02402774515526502, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.02402774515526502 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.45139664804469276, "acc_stderr": 0.016643307372315872, "acc_norm": 0.45139664804469276, "acc_norm_stderr": 0.016643307372315872 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7254901960784313, "acc_stderr": 0.02555316999182652, "acc_norm": 0.7254901960784313, "acc_norm_stderr": 0.02555316999182652 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7170418006430869, "acc_stderr": 0.02558306248998481, "acc_norm": 0.7170418006430869, "acc_norm_stderr": 0.02558306248998481 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7376543209876543, "acc_stderr": 0.024477222856135114, "acc_norm": 0.7376543209876543, "acc_norm_stderr": 0.024477222856135114 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 
0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46870925684485004, "acc_stderr": 0.012745204626083138, "acc_norm": 0.46870925684485004, "acc_norm_stderr": 0.012745204626083138 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.02824568739146292, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.02824568739146292 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.019047485239360378, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.019047485239360378 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.85, "acc_stderr": 0.03588702812826371, "acc_norm": 0.85, "acc_norm_stderr": 0.03588702812826371 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6193390452876377, "mc1_stderr": 0.01699762787190791, "mc2": 0.7652268363883217, "mc2_stderr": 0.014007821850957183 }, "harness|winogrande|5": { "acc": 0.8468823993685872, "acc_stderr": 0.010120623252272956 }, "harness|gsm8k|5": { "acc": 0.7012888551933283, "acc_stderr": 0.012607137125693635 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
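As a follow-up to the loading snippet above: the aggregated metrics live in the "results" configuration, whose "latest" split points at the most recent run, so a minimal sketch for pulling just the aggregate numbers (config and split names as described in this card) looks like:

```python
from datasets import load_dataset

# Minimal sketch; "results" and the "latest" split are described in this
# card, and the repo id matches the loading example above.
results = load_dataset(
    "open-llm-leaderboard/details_paulml__OGNO-7B",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent run
```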
open-llm-leaderboard/details_paulml__OGNO-7B
[ "region:us" ]
2024-02-12T18:39:36+00:00
{"pretty_name": "Evaluation run of paulml/OGNO-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [paulml/OGNO-7B](https://huggingface.co/paulml/OGNO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__OGNO-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T18:37:16.795511](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__OGNO-7B/blob/main/results_2024-02-12T18-37-16.795511.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6522805182245757,\n \"acc_stderr\": 0.031991089349509304,\n \"acc_norm\": 0.6514462866675927,\n \"acc_norm_stderr\": 0.032662107614853914,\n \"mc1\": 0.6193390452876377,\n \"mc1_stderr\": 0.01699762787190791,\n \"mc2\": 0.7652268363883217,\n \"mc2_stderr\": 0.014007821850957183\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.71160409556314,\n \"acc_stderr\": 0.013238394422428173,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7151961760605458,\n \"acc_stderr\": 0.004503985839041969,\n \"acc_norm\": 0.8899621589324835,\n \"acc_norm_stderr\": 0.0031229736320394718\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944427,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n \"acc_norm\": 
0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368983\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n \"acc_stderr\": 0.016643307372315872,\n \"acc_norm\": 0.45139664804469276,\n \"acc_norm_stderr\": 0.016643307372315872\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n \"acc_stderr\": 0.012745204626083138,\n \"acc_norm\": 0.46870925684485004,\n \"acc_norm_stderr\": 0.012745204626083138\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6193390452876377,\n \"mc1_stderr\": 0.01699762787190791,\n \"mc2\": 0.7652268363883217,\n \"mc2_stderr\": 0.014007821850957183\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272956\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7012888551933283,\n \"acc_stderr\": 0.012607137125693635\n }\n}\n```", "repo_url": "https://huggingface.co/paulml/OGNO-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|arc:challenge|25_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|gsm8k|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hellaswag|10_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-37-16.795511.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-37-16.795511.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-37-16.795511.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T18-37-16.795511.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-37-16.795511.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T18-37-16.795511.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["**/details_harness|winogrande|5_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T18-37-16.795511.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T18_37_16.795511", "path": ["results_2024-02-12T18-37-16.795511.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T18-37-16.795511.parquet"]}]}]}
2024-02-12T18:40:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of paulml/OGNO-7B

Dataset automatically created during the evaluation run of model paulml/OGNO-7B on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (a loading sketch is given after this card):

## Latest results

These are the latest results from run 2024-02-12T18:37:16.795511 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

## Dataset Details

### Dataset Description

- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:

### Dataset Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
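The loading snippet was dropped from this copy of the card; a minimal sketch following the convention used by these evaluation datasets is given below. The repository name assumes the standard `details_<org>__<model>` naming scheme, and `harness_winogrande_5` is just one of the 63 configs listed in this record's metadata:

```python
from datasets import load_dataset

# Load the details of one task; any config_name from the metadata above
# works here, e.g. "harness_gsm8k_5", or "results" for aggregated scores.
data = load_dataset(
    "open-llm-leaderboard/details_paulml__OGNO-7B",  # assumed repo name
    "harness_winogrande_5",
    split="train",  # "train" always points to the latest results
)
```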
[ "# Dataset Card for Evaluation run of paulml/OGNO-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/OGNO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T18:37:16.795511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of paulml/OGNO-7B\n\n\n\nDataset automatically created during the evaluation run of model paulml/OGNO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T18:37:16.795511(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b32b3277643e7fdd9d29bcfc3b4d41d8920557bc
# Dataset Card for "CASFMarked" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lollitor/CASFMarked
[ "region:us" ]
2024-02-12T18:42:25+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "INPUT", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 294982, "num_examples": 285}], "download_size": 120329, "dataset_size": 294982}}
2024-02-12T18:45:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for "CASFMarked" More Information needed
[ "# Dataset Card for \"CASFMarked\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"CASFMarked\"\n\nMore Information needed" ]
7205897f1f3ee65e296072f3e96d49488e54e8ce
# MeMo corpus v1.1

Jens Bjerring-Hansen, Philip Diderichsen, Dorte Haltrup Hansen, June 2023

This is data release version 1.1 of the MeMo corpus comprising almost all Danish novels from the period 1870-1899, known as the Modern Breakthrough. The current version of the corpus is publicly viewable and searchable at <https://alf.hum.ku.dk/korp/?mode=memo_all>.

The corpus has been enhanced since version 1.0 with the following 19 titles that have been reprocessed or added to the corpus.

1. Vilhelm Bergsøe: Bruden fra Rørvig (1872)
2. Johanne Schjørring: Rige Dage (1877)
3. Anonymous: Tante Jacobine (1878)
4. Jonas Lie: Rutland (1880)
5. Vilhelm Malling: Fra Kjøbstadlivet i gamle Dage (1882)
6. Adda Ravnkilde: To Fortællinger (1884)
7. Henrik Pontoppidan: Ung Elskov (1885)
8. Therese Brummer: Som man gifter sig (1888)
9. Henrik Pontoppidan: Natur (1890)
10. R.H.: En Kjøbenhavners Livshistorie eller Lykkens Omskiftelser (1891)
11. Henrik Pontoppidan: Minder (1893)
12. Johannes Jørgensen: Hjemvee (1894)
13. Henrik Pontoppidan: Nattevagt (1894)
14. Jonas Lie: Naar Sol gaar ned (1895)
15. Gustav Wied: Ungdomshistorier (1895)
16. Herman Bang: Ludvigsbakke (1896)
17. Cornelia Levetzow: Havemanden (1896)
18. Karl Larsen: Kresjan Vesterbro (1897)
19. Christian Christensen: Kærlighedens Mysterier (1899)

The release contains the following files:

| File | Contents |
| :--- | :--- |
| texts | Text files of the now 558 novels in the corpus. The text has a newline at line breaks in the book, and two newlines at page breaks. Some of the texts (the ones originally set in Fraktur) have been post-OCR-corrected using a procedure described in Bjerring-Hansen et al. (2022). The rest have been post-OCR-corrected as well: error types were identified manually and corrections implemented with look-up in the dictionary (Sprogteknologisk Ordbase, STO) to avoid the creation of new errors. This cautious method has the consequence that not all errors were corrected. |
| normalised | Orthographically normalized versions of the 558 texts. Same format as the files in "texts", normalized to Danish standard spelling. Nouns were lower cased, aa changed to å and frequent character patterns changed to obey the Danish orthography norm from 1948. Like the error-corrected version of the corpus, character patterns were identified manually and mainly implemented with look-up in the dictionary (Sprogteknologisk Ordbase, STO) to avoid overgeneration. The method has the consequence that not all words were normalized. |
| memo_all.vrt | VRT file (vertical format) of MeMo corpus v1.1 for indexing in Corpus Workbench (CWB). Format: One token per line delimited by \<corpus>, \<text>, and \<sentence> XML elements. The XML elements contain attributes with metadata. The tokens are annotated with various categories separated by tabs. For more information about the metadata, see the metadata excel file. For more information about the token annotations, see below. |
| MeMo-corpus-metadata-v1.1-2023-06-20.xlsx | Excel file with metadata about the novels in the corpus. See the "info" tab for information about the metadata categories. |
See the "info" tab for information about the metadata categories. | **Token annotations and metadata in VRT file** There are nine columns of tokens and annotations in the corpus VRT file: | Column 1 | Column 2 | Column 3 | Column 4 | Column 5 | Column 6 | Column 7 | Column 8 | Column 9 | | :------- | :--------- | :--------- | :------------- | :------------------- | :--------------- | :--------------- | :--------------- | :------- | | Token | Normalized | Lemma form | Part of speech | Word no. in sentence | Word no. in line | Word no. in book | Line no. on page | Page no. | For information about the metadata also contained in the VRT file, se the file MeMo-corpus-metadata-v1.1-2023-06-20.xlsx. **References** Bjerring-Hansen, Jens, et al. "Mending Fractured Texts. A heuristic procedure for correcting OCR data." (2022). <https://ceur-ws.org/Vol-3232/paper14.pdf> **Data Statement** ## 1. Header 1. Dataset Title MeMo Corpus 2. Dataset Curator(s) [name, affiliation] Jens Bjerring-Hansen, University of Copenhagen; Philip Diderichsen, University of Copenhagen; Dorte Haltrup Hansen, University of Copenhagen 3. Dataset Version [version, date] Version 1.1, August 15, 2023 4. Dataset Citation and, if available, #### 5. DOI Data Statement #### 6. Author(s) [name, affiliation] Jens Bjerring-Hansen, University of Copenhagen; Philip Diderichsen, University of Copenhagen 7. Data Statement Version [version, date] Version 1, September 25, 2023 8. Data Statement Citation #### ## 2. Executive summary The MeMo corpus is established to investigate literary and cultural change in a seminal epoch of Scandinavian cultural and social history (known as 'the modern breakthrough') using natural language processing and other computational methods. The corpus consists of original novels by Norwegian and Danish authors printed in Denmark in the period 1870-99. It includes 858 volumes, totaling 4.5 million sentences and 65 million words. ## 3. Text characteristics The corpus consists of novels, i.e. long works of narrative fiction, usually written in prose and published as a book. The novels contain both dialogue and description. As instances of imaginative literature they are infused with ambiguity, interpretational confounding, rhetorical sophistication, and narrative layerings between author, narrator, and characters. The cultural diversity of the texts in the corpus is pronounced. From a genre perspective, we have contemporary novels as well as historical novels and other forms of genre fiction such as romance, crime, and war stories (cf. Bjerring-Hansen and Rasmussen, 2023). And from an aesthetic perspective we have both avant-garde forms of realism, including instances of naturalism and impressionism, and more traditional prose with a preference for abstract or generalized over concrete specification (cf. Bjerring-Hansen and Wilkens, 2023). Bjerring-Hansen, Jens, and Sebastian Ørntoft Rasmussen. 2023. “Litteratursociologi og kvantitative litteraturstudier Den historiske roman i det moderne gennembrud som case”. In Passage 89: 171–189. Bjerring-Hansen, Jens, and Matt Wilkens. 2023. “Deep distant reading: The rise of realism in Scandinavian literature as a case study”. Orbis Litterarum. [doi:10.1111/oli.12396](https://doi.org/10.1111/oli.12396) ## 4. 
The MeMo Corpus was created as the basis for a research project, _MeMo – Measuring Modernity: Literary and Social Change in Scandinavia 1870-1900_, investigating how processes of social change in late nineteenth century Scandinavia were reflected and discussed in the novels from the period (project page: [https://nors.ku.dk/english/research/projects/measuring-modernity/](https://nors.ku.dk/english/research/projects/measuring-modernity/)). As opposed to traditional historiography on the period, which has focused on selected texts by a few prominent, male authors, our digital corpus, with rich metadata on texts and authors, allows for the capturing of robust literary and sociological trends and for new insights into the processes of modernization in this formative period in the literary and social history of Scandinavia. To this corpus we thus ask questions such as: How did this breakthrough of new ways of thinking and writing actually unfold? Who were the actors? And to what extent did newness relate to literature at large?

Also, the corpus acts as the empirical foundation of an interrelated methodological project, _Mining the Meaning_, which aims to develop state-of-the-art computational semantic methods and to train large language models for written late 19th-century Danish and Norwegian (project page: [https://mime-memo.github.io/](https://mime-memo.github.io/)).

Included in the corpus are all original (i.e. newly written) novels by Danish and Norwegian authors published in Denmark 1870-99. The list of texts was compiled on the basis of _Dansk Bogfortegnelse_ (a continuous list of books published in Denmark since 1841; from 1861 published annually) supplemented with literary handbooks and special bibliographies. Not included (mainly due to pragmatic reasons and for the sake of coherence) in the corpus are:

* reprints
* translations
* serializations (i.e. serialized novels from newspapers and magazines)
* diasporic literature (i.e. novels by Danish emigrant authors in the U.S.)

Around 20% of the novels are produced by female authors. Thus, highlighting and exploring the often overlooked female literary production of the period is a distinctive ambition of the corpus and the explorations based on it.

## 5. Language Varieties

The language of the novels in the corpus is late nineteenth century Danish (BCP-47: da). On the whole, we are dealing with a more or less linguistically coherent body of texts. However, the following circumstances must be acknowledged:

* The texts contain a pronounced spelling variation, partly on an individual level, partly explained by an ongoing orthographic standardization, which is most clearly expressed in the Spelling Reform of 1892. Here, forms such as 'Kjøbenhavn' and 'Familje' became 'København' and 'Familie'.
* Some books are written in dialect (e.g. Jutlandic or West Norwegian) or contain dialectal features to create psychological individualism in the dialogue.
* Approximately 16% of the books are written by Norwegian authors. In this regard it should be noted that, until 1907, written Norwegian was practically identical to written Danish. ‘Norvagisms’ (i.e. distinct Norwegian words, not used by Danes) do appear.

## 6. Preprocessing and data formatting

**OCR scans**: The book volumes were scanned with optical character recognition (OCR) by the Royal Danish Library’s Digitization on Demand (DoD) team.
The data were delivered as full volume PDF files with the OCR’ed text as an invisible searchable, copyable text layer, as full volume text files, and as single page text files (one text file per page for each volume).

**OCR correction**: The text files were automatically post-corrected for OCR errors. This involved two different processes, one for texts originally typeset in Antikva (Roman) typefaces, one in Fraktur (Gothic) typefaces. The Antikva files were corrected using a set of hand-crafted substitution patterns, with look-up in the dictionary Sprogteknologisk Ordbase, STO (Eng. ‘Word database for language technology’). The Fraktur files were corrected using a correction procedure involving a combination of spelling correction, hand-crafted pattern substitution, and improved OCR using the pretrained “Fraktur” Tesseract data plus an alternative OCR layer from the pretrained “dan” Tesseract data, which was used as a corrective to problems with the Danish characters “æ” and “ø” in particular. This procedure improved the word error rate of the Fraktur data from 10.46% to 2.84% (cf. Bjerring-Hansen et al. 2022).

Bjerring-Hansen, Jens, Philip Diderichsen, Dorte Haltrup Hansen, and Ross D. Kristensen-McLachlan. 2022. “Mending fractured texts. A heuristic procedure for correcting OCR.” Proceedings of the 6th Digital Humanities in the Nordic and Baltic Countries Conference, Uppsala, Sweden, March 15-18, 2022 (DHNB 2022): 177–186.

**Token-level annotation**: The corrected data were annotated with grammatical information using the pipeline orchestration tool Text Tonsorium available at [https://cst.dk/texton/](https://cst.dk/texton/), provided by the Danish CLARIN node. The particular pipeline used included the LaPos part of speech tagger, the CSTLemma lemmatizer, and an implementation of the Brill tagger. Grammatical information included lemma and part of speech, plus sentence and paragraph segmentation (which are of course not strictly speaking token-level annotations). In addition to the grammatical annotations, convenience annotations with various counters were also added: word number in sentence, word number on line, word number in book volume, line number on page, page number in book volume.

**Text normalization**: After OCR correction, all texts were normalized to modern Danish spelling using hand-crafted substitution patterns and lookup in STO (see above). Nouns were lower cased, “aa” changed to “å”, and frequent character patterns changed to obey modern Danish orthography. (A small sketch of this dictionary-gated substitution strategy is given at the end of this card.)

**VRT transformation**: After annotation with token-level categories and metadata, the data were transformed to a VRT file (vertical format) for indexing in Corpus Workbench (CWB). Format: One token per line delimited by <corpus>, <text>, and <sentence> XML elements. The XML elements contain attributes with metadata. The tokens are annotated with the above-mentioned token-level annotations, separated by tabs. For more information about the metadata, see below. (A minimal parsing sketch for this format is also given at the end of this card.)

The data are available as:

* OCR-corrected full volume text files
* Normalized full volume versions of these text files
* A single VRT file containing the whole corpus.

## 7. Limitations

A standard limitation of data preprocessed and annotated using automatic natural language processing tools and procedures is that the results are not perfect. Thus, basically all the layers of the data can be assumed to be flawed:

* Text data: The raw texts come from OCR scans of the physical book volumes.
  This process is not perfect, and although we have taken steps to mitigate errors, the basic text layer of the data can still be expected to have OCR errors (or wrong corrections) in 2-3% of tokens.
* Normalized data: The normalization to modern Danish spelling as such should not be expected to be perfect either. We currently do not have estimates of the error rate in the normalized data.
* Grammatical annotations: These are also added using automatic tools which cannot be expected to yield perfect results. We currently do not have estimates of error rates in the grammatical annotations.
* Metadata: The metadata are hand-curated by literary scholars and should be close to perfect. However, the occasional human error can of course not be ruled out.

## 8. Metadata

The metadata was curated with the help of students (Lasse Stein Holst, Lene Thanning Andersen, and Kirstine Nielsen Degn) on the basis of _Dansk Bogfortegnelse_ (1861-), [https://www.litteraturpriser.dk/](https://www.litteraturpriser.dk/), Ehrencron-Müller: _Anonym- og Pseudonym-Lexikon_ (1940) as well as additional literary and bibliographical handbooks. Among the metadata categories are the following:

* file_id
* filename
* [author] firstname
* [author] surname
* [author] pseudonym
* [author] gender [m/f/unknown]
* [author] nationality [da/no/unknown]
* title
* subtitle
* volume
* year [of publication]
* pages [in total]
* illustrations [y/n]
* typeface [gothic/roman]
* publisher
* price

## 9. Disclosure and Ethical Review

Funding for the creation and curation is supplied by The Carlsberg Foundation through a Young Researcher Fellowship awarded to Jens Bjerring-Hansen, University of Copenhagen. In terms of data management, the project data (novels from 1870-1900) consist of imaginative texts by non-living authors. The texts are out-of-copyright. From a GDPR perspective, the biographical, bibliographical and demographic data are historical as well as non-sensitive.
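To make the cautious, dictionary-gated substitution strategy described above concrete, here is a minimal Python sketch. The patterns and the stand-in word list are illustrative assumptions; the actual hand-crafted pattern sets and the STO word database are much larger and are not reproduced here.

```python
import re

# Illustrative orthography patterns (pre-1948 -> modern spelling).
# The real hand-crafted pattern set used for the corpus is far larger.
PATTERNS = [
    (re.compile(r"aa"), "å"),    # e.g. "gaar" -> "går"
    (re.compile(r"^Kj"), "K"),   # e.g. "Kjøbenhavn" -> "København"
]

def normalize(word: str, dictionary: set[str]) -> str:
    """Apply the substitution patterns, but accept the result only if it
    is attested in the dictionary; otherwise keep the original word.

    This mirrors the cautious method described in the card: candidates
    are generated by pattern substitution and gated by dictionary
    look-up to avoid creating new errors, with the consequence that
    not every word gets normalized.
    """
    candidate = word
    for pattern, replacement in PATTERNS:
        candidate = pattern.sub(replacement, candidate)
    return candidate if candidate.lower() in dictionary else word

# Toy usage with a stand-in for the STO word list:
sto_words = {"københavn", "går"}
print(normalize("Kjøbenhavn", sto_words))  # -> "København"
print(normalize("Kjole", sto_words))       # unchanged: "Kole" not attested
```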
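And here is a minimal sketch of reading the memo_all.vrt file, based only on the format description above (one tab-separated token per line, with \<corpus>, \<text>, and \<sentence> XML elements carrying the metadata). The attribute handling is generic because the exact attribute names on the \<text> element are documented in the metadata file, not here; "filename" below is an assumed example.

```python
import re
from collections import Counter

# The nine annotation columns, in the order listed in the card.
COLUMNS = [
    "token", "normalized", "lemma", "pos",
    "word_in_sentence", "word_on_line", "word_in_book",
    "line_on_page", "page",
]

def sentences(path):
    """Yield (text_metadata, sentence) pairs from a CWB-style VRT file.

    text_metadata is a dict of the attributes on the enclosing <text>
    element; sentence is a list of dicts keyed by the nine columns.
    Streaming generator, so the 65-million-word file never has to fit
    in memory at once.
    """
    text_attrs, tokens = {}, []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.rstrip("\n")
            if line.startswith("<text"):
                # Grab key="value" metadata attributes from <text ...>.
                text_attrs = dict(re.findall(r'(\w+)="([^"]*)"', line))
            elif line.startswith("</sentence"):
                yield text_attrs, tokens
                tokens = []
            elif line.startswith("<"):
                continue  # <corpus>, <sentence>, closing tags, ...
            elif line:
                tokens.append(dict(zip(COLUMNS, line.split("\t"))))

# Example: count sentences per novel ("filename" is an assumed attribute;
# consult MeMo-corpus-metadata-v1.1-2023-06-20.xlsx for the real names).
if __name__ == "__main__":
    counts = Counter()
    for meta, sent in sentences("memo_all.vrt"):
        counts[meta.get("filename", "?")] += 1
    print(counts.most_common(5))
```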
MiMe-MeMo/Corpus-v1.1
[ "language:da", "license:cc-by-4.0", "region:us" ]
2024-02-12T19:12:03+00:00
{"language": ["da"], "license": "cc-by-4.0"}
2024-02-16T08:28:28+00:00
[]
[ "da" ]
TAGS #language-Danish #license-cc-by-4.0 #region-us
MeMo corpus v1.1 ================ Jens Bjerring-Hansen, Philip Diderichsen, Dorte Haltrup Hansen, June 2023 This is data release version 1.1 of the MeMo corpus comprising almost all Danish novels from the period 1870-1899, known as the Modern Breakthrough. The current version of the corpus is publicly viewable and searchable at URL. The corpus has been enhanced since version 1.0 with the following 19 titles that have been reprocessed or added to the corpus. 1. Vilhelm Bergsøe: Bruden fra Rørvig (1872) 2. Johanne Schjørring: Rige Dage (1877) 3. Anonymous: Tante Jacobine (1878) 4. Jonas Lie: Rutland (1880) 5. Vilhelm Malling: Fra Kjøbstadlivet i gamle Dage (1882) 6. Adda Ravnkilde: To Fortællinger (1884) 7. Henrik Pontoppidan: Ung Elskov (1885) 8. Therese Brummer: Som man gifter sig (1888) 9. Henrik Pontoppidan: Natur (1890) 10. R.H.: En Kjøbenhavners Livshistorie eller Lykkens Omskiftelser (1891) 11. Henrik Pontoppidan: Minder (1893) 12. Johannes Jørgensen: Hjemvee (1894) 13. Henrik Pontoppidan: Nattevagt (1894) 14. Jonas Lie: Naar Sol gaar ned (1895) 15. Gustav Wied: Ungdomshistorier (1895) 16. Herman Bang: Ludvigsbakke (1896) 17. Cornelia Levetzow: Havemanden (1896) 18. Karl Larsen: Kresjan Vesterbro (1897) 19. Christian Christensen: Kærlighedens Mysterier (1899) The release contains the following files: Token annotations and metadata in VRT file There are nine columns of tokens and annotations in the corpus VRT file: For information about the metadata also contained in the VRT file, see the file URL. References Bjerring-Hansen, Jens, et al. "Mending Fractured Texts. A heuristic procedure for correcting OCR data." (2022). URL Data Statement 1. Header --------- ``` 1. Dataset Title MeMo Corpus 2. Dataset Curator(s) [name, affiliation] Jens Bjerring-Hansen, University of Copenhagen; Philip Diderichsen, University of Copenhagen; Dorte Haltrup Hansen, University of Copenhagen 3. Dataset Version [version, date] Version 1.1, August 15, 2023 4. Dataset Citation and, if available, #### 5. DOI Data Statement #### 6. Author(s) [name, affiliation] Jens Bjerring-Hansen, University of Copenhagen; Philip Diderichsen, University of Copenhagen 7. Data Statement Version [version, date] Version 1, September 25, 2023 8. Data Statement Citation #### ``` 2. Executive summary -------------------- The MeMo corpus is established to investigate literary and cultural change in a seminal epoch of Scandinavian cultural and social history (known as 'the modern breakthrough') using natural language processing and other computational methods. The corpus consists of original novels by Norwegian and Danish authors printed in Denmark in the period 1870-99. It includes 858 volumes, totaling 4.5 million sentences and 65 million words. 3. Text characteristics ----------------------- The corpus consists of novels, i.e. long works of narrative fiction, usually written in prose and published as a book. The novels contain both dialogue and description. As instances of imaginative literature they are infused with ambiguity, interpretational confounding, rhetorical sophistication, and narrative layerings between author, narrator, and characters. The cultural diversity of the texts in the corpus is pronounced. From a genre perspective, we have contemporary novels as well as historical novels and other forms of genre fiction such as romance, crime, and war stories (cf. Bjerring-Hansen and Rasmussen, 2023).
And from an aesthetic perspective we have both avant-garde forms of realism, including instances of naturalism and impressionism, and more traditional prose with a preference for abstract or generalized over concrete specification (cf. Bjerring-Hansen and Wilkens, 2023). Bjerring-Hansen, Jens, and Sebastian Ørntoft Rasmussen. 2023. “Litteratursociologi og kvantitative litteraturstudier. Den historiske roman i det moderne gennembrud som case”. In Passage 89: 171–189. Bjerring-Hansen, Jens, and Matt Wilkens. 2023. “Deep distant reading: The rise of realism in Scandinavian literature as a case study”. Orbis Litterarum. doi:10.1111/oli.12396 4. Curation Rationale --------------------- The MeMo Corpus was created as the basis for a research project, *MeMo – Measuring Modernity: Literary and Social Change in Scandinavia 1870-1900*, investigating how processes of social change in late nineteenth century Scandinavia were reflected and discussed in the novels from the period (project page: URL). As opposed to traditional historiography on the period, which has focused on selected texts by a few prominent, male authors, our digital corpus, with rich metadata on texts and authors, allows for the capturing of robust literary and sociological trends and for new insights into the processes of modernization in this formative period in the literary and social history of Scandinavia. To this corpus we thus ask questions such as: How did this breakthrough of new ways of thinking and writing actually unfold? Who were the actors? And to what extent did newness relate to literature at large? Also, the corpus acts as the empirical foundation of an interrelated methodological project, *Mining the Meaning*, which aims to develop state-of-the-art computational semantic methods and to train large language models for written late 19th-century Danish and Norwegian (project page: URL). Included in the corpus are all original (i.e. newly written) novels by Danish and Norwegian authors published in Denmark 1870-99. The list of texts was compiled on the basis of *Dansk Bogfortegnelse* (a continuous list of books published in Denmark since 1841; from 1861 published annually) supplemented with literary handbooks and special bibliographies. Not included (mainly due to pragmatic reasons and for the sake of coherence) in the corpus are: * reprints * translations * serializations (i.e. serialized novels from newspapers and magazines) * diasporic literature (i.e. novels by Danish emigrant authors in the U.S.) Around 20% of the novels are produced by female authors. Thus, highlighting and exploring the often overlooked female literary production of the period is a distinctive ambition of the corpus and the explorations based on it. 5. Language Varieties --------------------- The language of the novels in the corpus is late nineteenth century Danish (BCP-47: da). On the whole, we are dealing with a more or less linguistically coherent body of texts. However, the following circumstances must be acknowledged: * The texts contain a pronounced spelling variation, partly on an individual level, partly explained by an ongoing orthographic standardization, which is most clearly expressed in the Spelling Reform of 1892. Here, forms such as 'Kjøbenhavn' and 'Familje' became 'København' and 'Familie'. * Some books are written in dialect (e.g. Jutlandic or West Norwegian) or contain dialectal features to create psychological individualism in the dialogue. * Approximately 16% of the books are written by Norwegian authors.
In this regard it should be noted that, until 1907, written Norwegian was practically identical to written Danish. ‘Norvagisms’ (i.e. distinct Norwegian words, not used by Danes) do appear. 6. Preprocessing and data formatting ------------------------------------ OCR scans: The book volumes were scanned with optical character recognition (OCR) by the Royal Danish Library’s Digitization on Demand (DoD) team. The data were delivered as full volume PDF files with the OCR’ed text as an invisible searchable, copyable text layer, as full volume text files, and as single page text files (one text file per page for each volume). OCR correction: The text files were automatically post-corrected for OCR errors. This involved two different processes, one for texts originally typeset in Antikva (Roman) typefaces, one in Fraktur (Gothic) typefaces. The Antikva files were corrected using a set of hand-crafted substitution patterns, with look-up in the dictionary Sprogteknologisk Ordbase, STO (Eng. ‘Word database for language technology’). The Fraktur files were corrected using a correction procedure involving a combination of spelling correction, hand-crafted pattern substitution, and improved OCR using the pretrained “Fraktur” Tesseract data plus an alternative OCR layer from the pretrained “dan” Tesseract data, which was used as a corrective to problems with the Danish characters “æ” and “ø” in particular. This procedure improved the word error rate of the Fraktur data from 10.46% to 2.84% (cf. Bjerring-Hansen et al. 2022). Bjerring-Hansen, Jens, Philip Diderichsen, Dorte Haltrup Hansen, and Ross D. Kristensen-McLachlan. 2022. “Mending fractured texts. A heuristic procedure for correcting OCR.” Proceedings of the 6th Digital Humanities in the Nordic and Baltic Countries Conference, Uppsala, Sweden, March 15-18, 2022 (DHNB 2022): 177–186. Token-level annotation: The corrected data were annotated with grammatical information using the pipeline orchestration tool Text Tonsorium available at URL provided by the Danish CLARIN node. The particular pipeline used included the LaPos part of speech tagger, the CSTLemma lemmatizer, and an implementation of the Brill tagger. Grammatical information included lemma and part of speech, plus sentence and paragraph segmentation (which are of course not strictly speaking token-level annotations). In addition to the grammatical annotations, convenience annotations with various counters were also added: word number in sentence, word number on line, word number in book volume, line number on page, page number in book volume. Text normalization: After OCR correction, all texts were normalized to modern Danish spelling using hand-crafted substitution patterns and lookup in STO (see above). Nouns were lower cased, “aa” changed to “å”, and frequent character patterns changed to obey modern Danish orthography. VRT transformation: After annotation with token-level categories and metadata, the data were transformed to a VRT file (vertical format) for indexing in Corpus Workbench (CWB). Format: One token per line delimited by <corpus>, <text>, and <sentence> XML elements. The XML elements contain attributes with metadata. The tokens are annotated with the above-mentioned token-level annotations, separated by tabs. For more information about the metadata, see below. The data are available as: * OCR-corrected full volume text files * Normalized full volume versions of these text files * A single VRT file containing the whole corpus. 7. 
7. Limitations
--------------

A standard limitation of data preprocessed and annotated using automatic natural language processing tools and procedures is that the results are not perfect. Thus, basically all the layers of the data can be assumed to be flawed:

* Text data: The raw texts come from OCR scans of the physical book volumes. This process is not perfect, and although we have taken steps to mitigate errors, the basic text layer of the data can still be expected to have OCR errors (or wrong corrections) in 2-3% of tokens.
* Normalized data: The normalization to modern Danish spelling as such should not be expected to be perfect either. We currently do not have estimates of the error rate in the normalized data.
* Grammatical annotations: These are also added using automatic tools which cannot be expected to yield perfect results. We currently do not have estimates of error rates in the grammatical annotations.
* Metadata: The metadata are hand-curated by literary scholars and should be close to perfect. However, the occasional human error can of course not be ruled out.

8. Metadata
-----------

The metadata was curated with the help of students (Lasse Stein Holst, Lene Thanning Andersen, and Kirstine Nielsen Degn) on the basis of *Dansk Bogfortegnelse* (1861-), URL, Ehrencron-Müller: *Anonym- og Pseudonym-Lexikon* (1940), as well as additional literary and bibliographical handbooks. Among the metadata categories are the following:

* file_id
* filename
* [author] firstname
* [author] surname
* [author] pseudonym
* [author] gender [m/f/unknown]
* [author] nationality [da/no/unknown]
* title
* subtitle
* volume
* year [of publication]
* pages [in total]
* illustrations [y/n]
* typeface [gothic/roman]
* publisher
* price

9. Disclosure and Ethical Review
--------------------------------

Funding for the creation and curation is supplied by The Carlsberg Foundation through a Young Researcher Fellowship awarded to Jens Bjerring-Hansen, University of Copenhagen. In terms of data management, the project data (novels from 1870-1900) consist of imaginative texts by non-living authors. The texts are out of copyright. From a GDPR perspective, the biographical, bibliographical and demographic data are historical as well as non-sensitive.
[]
[ "TAGS\n#language-Danish #license-cc-by-4.0 #region-us \n" ]
f45b8dc8bfa4fadae222f97e694e785c46222ea3
# Dataset Card for "FSProtein" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lollitor/FSProtein
[ "region:us" ]
2024-02-12T19:23:02+00:00
{"dataset_info": {"features": [{"name": "#code", "dtype": "string"}, {"name": "inputs", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 16350621, "num_examples": 16245}], "download_size": 1806661, "dataset_size": 16350621}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T19:23:05+00:00
[]
[]
TAGS #region-us
# Dataset Card for "FSProtein" More Information needed
[ "# Dataset Card for \"FSProtein\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"FSProtein\"\n\nMore Information needed" ]
62b5a44828bbc493cdeb1140b3e0ed59d4d397de
# Dataset Card for "FSPocket" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lollitor/FSPocket
[ "region:us" ]
2024-02-12T19:23:35+00:00
{"dataset_info": {"features": [{"name": "#code", "dtype": "string"}, {"name": "inputs", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3771291, "num_examples": 16245}], "download_size": 885637, "dataset_size": 3771291}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-12T19:23:38+00:00
[]
[]
TAGS #region-us
# Dataset Card for "FSPocket" More Information needed
[ "# Dataset Card for \"FSPocket\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"FSPocket\"\n\nMore Information needed" ]
90cdcb82de30f04d084599dc37fa054776862e8e
# Credit Card Fraud Detection

This dataset was downloaded from https://www.kaggle.com/datasets/mlg-ulb/creditcardfraud/data and uploaded for educational purposes.
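A minimal loading sketch is shown below; the file name `creditcard.csv` is an assumption based on the original Kaggle release, so adjust it if the repository stores the data differently.

```python
import pandas as pd
from huggingface_hub import hf_hub_download

# Assumption: the repository stores the original Kaggle CSV under this name.
path = hf_hub_download(
    repo_id="David-Egea/Creditcard-fraud-detection",
    filename="creditcard.csv",
    repo_type="dataset",
)
df = pd.read_csv(path)
# In the original Kaggle data, "Class" is the label: 1 = fraud, 0 = legitimate.
print(df["Class"].value_counts())
```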
David-Egea/Creditcard-fraud-detection
[ "license:mit", "region:us" ]
2024-02-12T19:24:27+00:00
{"license": "mit"}
2024-02-12T19:37:37+00:00
[]
[]
TAGS #license-mit #region-us
# Credit Card Fraud Detection This dataset was downloaded from URL and uploaded for educational purposes.
[ "# Credit Card Fraud Detection\n\nThis dataset was downloaded from URL adn uploaded for educational purposes." ]
[ "TAGS\n#license-mit #region-us \n", "# Credit Card Fraud Detection\n\nThis dataset was downloaded from URL adn uploaded for educational purposes." ]
db898e14572a07e05ce62b6f633c92473ae3d088
German Azure ML translation of [argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo) for DPO finetuning.
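A minimal loading sketch; the split name and column layout are assumptions, so inspect the loaded dataset before wiring it into a DPO trainer:

```python
from datasets import load_dataset

# Assumptions: a "train" split exists and the columns follow the upstream
# distilabel DPO layout (prompt/chosen/rejected-style fields).
ds = load_dataset("mayflowergmbh/distilabel-math-preference-dpo-de", split="train")
print(ds.column_names)
print(ds[0])
```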
mayflowergmbh/distilabel-math-preference-dpo-de
[ "task_categories:text-generation", "language:de", "license:apache-2.0", "math", "region:us" ]
2024-02-12T19:26:40+00:00
{"language": ["de"], "license": "apache-2.0", "task_categories": ["text-generation"], "tags": ["math"]}
2024-02-14T13:39:24+00:00
[]
[ "de" ]
TAGS #task_categories-text-generation #language-German #license-apache-2.0 #math #region-us
German Azure ML translation of argilla/distilabel-math-preference-dpo for DPO finetuning.
[]
[ "TAGS\n#task_categories-text-generation #language-German #license-apache-2.0 #math #region-us \n" ]
69852d07352c1754561704aea47f0d253f8a1c02
# Dataset Card for "FSMarked" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Lollitor/FSMarked
[ "region:us" ]
2024-02-12T19:29:19+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "INPUT", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17636085, "num_examples": 16245}], "download_size": 261423, "dataset_size": 17636085}}
2024-02-12T19:29:21+00:00
[]
[]
TAGS #region-us
# Dataset Card for "FSMarked" More Information needed
[ "# Dataset Card for \"FSMarked\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"FSMarked\"\n\nMore Information needed" ]
84362fe6e402e4655701f9fe82e3d98baf9d06cf
# Dataset Card for Evaluation run of shahzebnaveed/codeparrot-ds <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [shahzebnaveed/codeparrot-ds](https://huggingface.co/shahzebnaveed/codeparrot-ds) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configuration, each one coresponding to one of the evaluated task. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results. An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_shahzebnaveed__codeparrot-ds", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T19:38:09.822527](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__codeparrot-ds/blob/main/results_2024-02-12T19-38-09.822527.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.23201649345939454, "acc_stderr": 0.029926255967504155, "acc_norm": 0.23187502288285278, "acc_norm_stderr": 0.03071406840692922, "mc1": 0.25703794369645044, "mc1_stderr": 0.015298077509485083, "mc2": 0.508529859261986, "mc2_stderr": 0.016866502520252444 }, "harness|arc:challenge|25": { "acc": 0.2158703071672355, "acc_stderr": 0.012022975360030672, "acc_norm": 0.2525597269624573, "acc_norm_stderr": 0.012696728980207706 }, "harness|hellaswag|10": { "acc": 0.25672176857199763, "acc_stderr": 0.0043593182064286815, "acc_norm": 0.2575184226249751, "acc_norm_stderr": 0.004363736410689627 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.22, "acc_stderr": 0.04163331998932268, "acc_norm": 0.22, "acc_norm_stderr": 0.04163331998932268 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.18518518518518517, "acc_stderr": 0.03355677216313142, "acc_norm": 0.18518518518518517, "acc_norm_stderr": 0.03355677216313142 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.17763157894736842, "acc_stderr": 0.031103182383123398, "acc_norm": 0.17763157894736842, "acc_norm_stderr": 0.031103182383123398 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.21509433962264152, "acc_stderr": 0.02528839450289137, "acc_norm": 0.21509433962264152, "acc_norm_stderr": 0.02528839450289137 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2569444444444444, "acc_stderr": 0.03653946969442099, "acc_norm": 0.2569444444444444, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.2, "acc_stderr": 0.04020151261036845, "acc_norm": 0.2, "acc_norm_stderr": 0.04020151261036845 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.26, "acc_stderr": 0.0440844002276808, "acc_norm": 0.26, "acc_norm_stderr": 0.0440844002276808 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.20809248554913296, "acc_stderr": 0.030952890217749874, "acc_norm": 0.20809248554913296, "acc_norm_stderr": 0.030952890217749874 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.21568627450980393, "acc_stderr": 0.04092563958237654, "acc_norm": 0.21568627450980393, "acc_norm_stderr": 0.04092563958237654 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.28, "acc_stderr": 0.045126085985421276, "acc_norm": 0.28, "acc_norm_stderr": 0.045126085985421276 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.26382978723404255, "acc_stderr": 0.028809989854102973, "acc_norm": 0.26382978723404255, "acc_norm_stderr": 0.028809989854102973 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813365, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813365 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2413793103448276, "acc_stderr": 0.03565998174135302, "acc_norm": 0.2413793103448276, "acc_norm_stderr": 0.03565998174135302 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.20899470899470898, "acc_stderr": 0.02094048156533486, "acc_norm": 0.20899470899470898, "acc_norm_stderr": 0.02094048156533486 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.04040610178208841, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.04040610178208841 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.038612291966536934, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536934 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.1774193548387097, "acc_stderr": 0.02173254068932927, "acc_norm": 0.1774193548387097, "acc_norm_stderr": 0.02173254068932927 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.15270935960591134, "acc_stderr": 0.02530890453938063, "acc_norm": 0.15270935960591134, "acc_norm_stderr": 0.02530890453938063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.25, "acc_stderr": 0.04351941398892446, "acc_norm": 0.25, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.21212121212121213, "acc_stderr": 0.03192271569548299, "acc_norm": 0.21212121212121213, "acc_norm_stderr": 0.03192271569548299 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.17676767676767677, "acc_stderr": 0.027178752639044915, "acc_norm": 0.17676767676767677, "acc_norm_stderr": 0.027178752639044915 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.19689119170984457, "acc_stderr": 0.028697873971860664, "acc_norm": 0.19689119170984457, "acc_norm_stderr": 0.028697873971860664 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.20256410256410257, "acc_stderr": 0.020377660970371372, "acc_norm": 0.20256410256410257, "acc_norm_stderr": 0.020377660970371372 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.2111111111111111, "acc_stderr": 0.024882116857655075, "acc_norm": 0.2111111111111111, "acc_norm_stderr": 0.024882116857655075 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.21008403361344538, "acc_stderr": 0.026461398717471874, "acc_norm": 0.21008403361344538, "acc_norm_stderr": 0.026461398717471874 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.1986754966887417, 
"acc_stderr": 0.03257847384436776, "acc_norm": 0.1986754966887417, "acc_norm_stderr": 0.03257847384436776 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.1926605504587156, "acc_stderr": 0.016909276884936094, "acc_norm": 0.1926605504587156, "acc_norm_stderr": 0.016909276884936094 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1527777777777778, "acc_stderr": 0.024536326026134224, "acc_norm": 0.1527777777777778, "acc_norm_stderr": 0.024536326026134224 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25, "acc_stderr": 0.03039153369274154, "acc_norm": 0.25, "acc_norm_stderr": 0.03039153369274154 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.270042194092827, "acc_stderr": 0.028900721906293426, "acc_norm": 0.270042194092827, "acc_norm_stderr": 0.028900721906293426 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.31390134529147984, "acc_stderr": 0.031146796482972465, "acc_norm": 0.31390134529147984, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2595419847328244, "acc_stderr": 0.03844876139785271, "acc_norm": 0.2595419847328244, "acc_norm_stderr": 0.03844876139785271 }, "harness|hendrycksTest-international_law|5": { "acc": 0.2396694214876033, "acc_stderr": 0.03896878985070417, "acc_norm": 0.2396694214876033, "acc_norm_stderr": 0.03896878985070417 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.25925925925925924, "acc_stderr": 0.042365112580946336, "acc_norm": 0.25925925925925924, "acc_norm_stderr": 0.042365112580946336 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.22085889570552147, "acc_stderr": 0.032591773927421776, "acc_norm": 0.22085889570552147, "acc_norm_stderr": 0.032591773927421776 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.3125, "acc_stderr": 0.043994650575715215, "acc_norm": 0.3125, "acc_norm_stderr": 0.043994650575715215 }, "harness|hendrycksTest-management|5": { "acc": 0.17475728155339806, "acc_stderr": 0.037601780060266224, "acc_norm": 0.17475728155339806, "acc_norm_stderr": 0.037601780060266224 }, "harness|hendrycksTest-marketing|5": { "acc": 0.2905982905982906, "acc_stderr": 0.02974504857267404, "acc_norm": 0.2905982905982906, "acc_norm_stderr": 0.02974504857267404 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.23754789272030652, "acc_stderr": 0.015218733046150193, "acc_norm": 0.23754789272030652, "acc_norm_stderr": 0.015218733046150193 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.24855491329479767, "acc_stderr": 0.023267528432100174, "acc_norm": 0.24855491329479767, "acc_norm_stderr": 0.023267528432100174 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.23798882681564246, "acc_stderr": 0.014242630070574915, "acc_norm": 0.23798882681564246, "acc_norm_stderr": 0.014242630070574915 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.22549019607843138, "acc_stderr": 0.023929155517351284, "acc_norm": 0.22549019607843138, "acc_norm_stderr": 0.023929155517351284 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.1864951768488746, "acc_stderr": 0.02212243977248077, "acc_norm": 0.1864951768488746, "acc_norm_stderr": 0.02212243977248077 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.21604938271604937, "acc_stderr": 0.022899162918445806, "acc_norm": 0.21604938271604937, "acc_norm_stderr": 0.022899162918445806 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.23404255319148937, "acc_stderr": 0.025257861359432417, "acc_norm": 0.23404255319148937, "acc_norm_stderr": 0.025257861359432417 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.2457627118644068, "acc_stderr": 0.010996156635142692, "acc_norm": 0.2457627118644068, "acc_norm_stderr": 0.010996156635142692 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.18382352941176472, "acc_stderr": 0.023529242185193106, "acc_norm": 0.18382352941176472, "acc_norm_stderr": 0.023529242185193106 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.25, "acc_stderr": 0.01751781884501444, "acc_norm": 0.25, "acc_norm_stderr": 0.01751781884501444 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.21818181818181817, "acc_stderr": 0.03955932861795833, "acc_norm": 0.21818181818181817, "acc_norm_stderr": 0.03955932861795833 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.18775510204081633, "acc_stderr": 0.02500025603954621, "acc_norm": 0.18775510204081633, "acc_norm_stderr": 0.02500025603954621 }, "harness|hendrycksTest-sociology|5": { "acc": 0.24378109452736318, "acc_stderr": 0.03036049015401465, "acc_norm": 0.24378109452736318, "acc_norm_stderr": 0.03036049015401465 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.28, "acc_stderr": 0.04512608598542128, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.28313253012048195, "acc_stderr": 0.03507295431370518, "acc_norm": 0.28313253012048195, "acc_norm_stderr": 0.03507295431370518 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.3216374269005848, "acc_stderr": 0.03582529442573122, "acc_norm": 0.3216374269005848, "acc_norm_stderr": 0.03582529442573122 }, "harness|truthfulqa:mc|0": { "mc1": 0.25703794369645044, "mc1_stderr": 0.015298077509485083, "mc2": 0.508529859261986, "mc2_stderr": 0.016866502520252444 }, "harness|winogrande|5": { "acc": 0.5098658247829518, "acc_stderr": 0.014049749833367589 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_shahzebnaveed__codeparrot-ds
[ "region:us" ]
2024-02-12T19:39:39+00:00
{"pretty_name": "Evaluation run of shahzebnaveed/codeparrot-ds", "dataset_summary": "Dataset automatically created during the evaluation run of model [shahzebnaveed/codeparrot-ds](https://huggingface.co/shahzebnaveed/codeparrot-ds) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shahzebnaveed__codeparrot-ds\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T19:38:09.822527](https://huggingface.co/datasets/open-llm-leaderboard/details_shahzebnaveed__codeparrot-ds/blob/main/results_2024-02-12T19-38-09.822527.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23201649345939454,\n \"acc_stderr\": 0.029926255967504155,\n \"acc_norm\": 0.23187502288285278,\n \"acc_norm_stderr\": 0.03071406840692922,\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.015298077509485083,\n \"mc2\": 0.508529859261986,\n \"mc2_stderr\": 0.016866502520252444\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2158703071672355,\n \"acc_stderr\": 0.012022975360030672,\n \"acc_norm\": 0.2525597269624573,\n \"acc_norm_stderr\": 0.012696728980207706\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25672176857199763,\n \"acc_stderr\": 0.0043593182064286815,\n \"acc_norm\": 0.2575184226249751,\n \"acc_norm_stderr\": 0.004363736410689627\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n 
\"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 
0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n \"mc1_stderr\": 0.015298077509485083,\n \"mc2\": 0.508529859261986,\n \"mc2_stderr\": 0.016866502520252444\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5098658247829518,\n \"acc_stderr\": 0.014049749833367589\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/shahzebnaveed/codeparrot-ds", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|arc:challenge|25_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|gsm8k|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hellaswag|10_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.822527.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.822527.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.822527.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.822527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.822527.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.822527.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["**/details_harness|winogrande|5_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T19-38-09.822527.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T19_38_09.822527", "path": ["results_2024-02-12T19-38-09.822527.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T19-38-09.822527.parquet"]}]}]}
2024-02-12T19:40:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of shahzebnaveed/codeparrot-ds Dataset automatically created during the evaluation run of model shahzebnaveed/codeparrot-ds on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T19:38:09.822527 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of shahzebnaveed/codeparrot-ds\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/codeparrot-ds on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T19:38:09.822527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of shahzebnaveed/codeparrot-ds\n\n\n\nDataset automatically created during the evaluation run of model shahzebnaveed/codeparrot-ds on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T19:38:09.822527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
c1fb74b358755b2883fa6daa20300ccd61d0df17
# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Hercules-v2.0

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [indischepartij/MiniCPM-3B-Hercules-v2.0](https://huggingface.co/indischepartij/MiniCPM-3B-Hercules-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Hercules-v2.0",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-02-12T19:38:09.464033](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Hercules-v2.0/blob/main/results_2024-02-12T19-38-09.464033.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.5172565982528783,
        "acc_stderr": 0.03443539534813297,
        "acc_norm": 0.5200192814805001,
        "acc_norm_stderr": 0.03514172263989754,
        "mc1": 0.2607099143206854,
        "mc1_stderr": 0.015368841620766372,
        "mc2": 0.40370896766601344,
        "mc2_stderr": 0.014397183168821114
    },
    "harness|arc:challenge|25": {
        "acc": 0.4052901023890785,
        "acc_stderr": 0.014346869060229325,
        "acc_norm": 0.4325938566552901,
        "acc_norm_stderr": 0.01447800569418253
    },
    "harness|hellaswag|10": {
        "acc": 0.5245966938856802,
        "acc_stderr": 0.004983740145218602,
        "acc_norm": 0.711113324039036,
        "acc_norm_stderr": 0.004523188431142896
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.33,
        "acc_stderr": 0.04725815626252605,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.04725815626252605
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.43703703703703706,
        "acc_stderr": 0.04284958639753399,
        "acc_norm": 0.43703703703703706,
        "acc_norm_stderr": 0.04284958639753399
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.5723684210526315,
        "acc_stderr": 0.040260970832965634,
        "acc_norm": 0.5723684210526315,
        "acc_norm_stderr": 0.040260970832965634
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.55,
        "acc_stderr": 0.049999999999999996,
        "acc_norm": 0.55,
        "acc_norm_stderr": 0.049999999999999996
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.5433962264150943,
        "acc_stderr": 0.030656748696739428,
        "acc_norm": 0.5433962264150943,
        "acc_norm_stderr": 0.030656748696739428
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.6041666666666666,
        "acc_stderr": 0.04089465449325583,
        "acc_norm": 0.6041666666666666,
        "acc_norm_stderr": 0.04089465449325583
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.42,
        "acc_stderr": 0.049604496374885836,
        "acc_norm": 0.42,
        "acc_norm_stderr": 0.049604496374885836
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.45,
        "acc_stderr": 0.04999999999999999,
        "acc_norm": 0.45,
        "acc_norm_stderr": 0.04999999999999999
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.33,
        "acc_stderr": 0.04725815626252605,
        "acc_norm": 0.33,
        "acc_norm_stderr": 0.04725815626252605
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.5144508670520231,
        "acc_stderr": 0.03810871630454764,
        "acc_norm": 0.5144508670520231,
        "acc_norm_stderr": 0.03810871630454764
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.38235294117647056,
        "acc_stderr": 0.04835503696107223,
        "acc_norm": 0.38235294117647056,
        "acc_norm_stderr": 0.04835503696107223
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.58,
        "acc_stderr": 0.049604496374885836,
        "acc_norm": 0.58,
        "acc_norm_stderr": 0.049604496374885836
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.40425531914893614,
        "acc_stderr": 0.03208115750788684,
        "acc_norm": 0.40425531914893614,
        "acc_norm_stderr": 0.03208115750788684
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.34210526315789475,
        "acc_stderr": 0.04462917535336936,
        "acc_norm": 0.34210526315789475,
        "acc_norm_stderr": 0.04462917535336936
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.4827586206896552,
        "acc_stderr": 0.04164188720169377,
        "acc_norm": 0.4827586206896552,
        "acc_norm_stderr": 0.04164188720169377
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.36772486772486773,
        "acc_stderr": 0.024833839825562427,
        "acc_norm": 0.36772486772486773,
        "acc_norm_stderr": 0.024833839825562427
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.30952380952380953,
        "acc_stderr": 0.041349130183033156,
        "acc_norm": 0.30952380952380953,
        "acc_norm_stderr": 0.041349130183033156
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.3,
        "acc_stderr": 0.046056618647183814,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.046056618647183814
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.6451612903225806,
        "acc_stderr": 0.02721888977330876,
        "acc_norm": 0.6451612903225806,
        "acc_norm_stderr": 0.02721888977330876
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.4236453201970443,
        "acc_stderr": 0.03476725747649037,
        "acc_norm": 0.4236453201970443,
        "acc_norm_stderr": 0.03476725747649037
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.57,
        "acc_stderr": 0.04975698519562428,
        "acc_norm": 0.57,
        "acc_norm_stderr": 0.04975698519562428
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.5575757575757576,
        "acc_stderr": 0.03878372113711274,
        "acc_norm": 0.5575757575757576,
        "acc_norm_stderr": 0.03878372113711274
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.6919191919191919,
        "acc_stderr": 0.03289477330098617,
        "acc_norm": 0.6919191919191919,
        "acc_norm_stderr": 0.03289477330098617
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.7253886010362695,
        "acc_stderr": 0.03221024508041152,
        "acc_norm": 0.7253886010362695,
        "acc_norm_stderr": 0.03221024508041152
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.48205128205128206,
        "acc_stderr": 0.025334667080954932,
        "acc_norm": 0.48205128205128206,
        "acc_norm_stderr": 0.025334667080954932
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.29259259259259257,
        "acc_stderr": 0.027738969632176088,
        "acc_norm": 0.29259259259259257,
        "acc_norm_stderr": 0.027738969632176088
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.5630252100840336,
        "acc_stderr": 0.03221943636566196,
        "acc_norm": 0.5630252100840336,
        "acc_norm_stderr": 0.03221943636566196
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.271523178807947,
        "acc_stderr": 0.036313298039696525,
        "acc_norm": 0.271523178807947,
        "acc_norm_stderr": 0.036313298039696525
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.6862385321100918,
        "acc_stderr": 0.019894723341469127,
        "acc_norm": 0.6862385321100918,
        "acc_norm_stderr": 0.019894723341469127
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.36574074074074076,
        "acc_stderr": 0.03284738857647207,
        "acc_norm": 0.36574074074074076,
        "acc_norm_stderr": 0.03284738857647207
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.6029411764705882,
        "acc_stderr": 0.03434131164719129,
        "acc_norm": 0.6029411764705882,
        "acc_norm_stderr": 0.03434131164719129
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.6455696202531646,
        "acc_stderr": 0.031137304297185812,
        "acc_norm": 0.6455696202531646,
        "acc_norm_stderr": 0.031137304297185812
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.5829596412556054,
        "acc_stderr": 0.03309266936071721,
        "acc_norm": 0.5829596412556054,
        "acc_norm_stderr": 0.03309266936071721
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.6335877862595419,
        "acc_stderr": 0.04225875451969637,
        "acc_norm": 0.6335877862595419,
        "acc_norm_stderr": 0.04225875451969637
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.6859504132231405,
        "acc_stderr": 0.04236964753041018,
        "acc_norm": 0.6859504132231405,
        "acc_norm_stderr": 0.04236964753041018
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.5740740740740741,
        "acc_stderr": 0.0478034362693679,
        "acc_norm": 0.5740740740740741,
        "acc_norm_stderr": 0.0478034362693679
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.6257668711656442,
        "acc_stderr": 0.03802068102899615,
        "acc_norm": 0.6257668711656442,
        "acc_norm_stderr": 0.03802068102899615
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.30357142857142855,
        "acc_stderr": 0.04364226155841044,
        "acc_norm": 0.30357142857142855,
        "acc_norm_stderr": 0.04364226155841044
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.6407766990291263,
        "acc_stderr": 0.04750458399041696,
        "acc_norm": 0.6407766990291263,
        "acc_norm_stderr": 0.04750458399041696
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8076923076923077,
        "acc_stderr": 0.02581923325648371,
        "acc_norm": 0.8076923076923077,
        "acc_norm_stderr": 0.02581923325648371
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.6666666666666666,
        "acc_stderr": 0.01685739124747255,
        "acc_norm": 0.6666666666666666,
        "acc_norm_stderr": 0.01685739124747255
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.5867052023121387,
        "acc_stderr": 0.02651126136940924,
        "acc_norm": 0.5867052023121387,
        "acc_norm_stderr": 0.02651126136940924
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.25921787709497207,
        "acc_stderr": 0.014655780837497736,
        "acc_norm": 0.25921787709497207,
        "acc_norm_stderr": 0.014655780837497736
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.565359477124183,
        "acc_stderr": 0.028384256704883037,
        "acc_norm": 0.565359477124183,
        "acc_norm_stderr": 0.028384256704883037
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.5884244372990354,
        "acc_stderr": 0.02795048149440127,
        "acc_norm": 0.5884244372990354,
        "acc_norm_stderr": 0.02795048149440127
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.5709876543209876,
        "acc_stderr": 0.027538925613470863,
        "acc_norm": 0.5709876543209876,
        "acc_norm_stderr": 0.027538925613470863
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.36879432624113473,
        "acc_stderr": 0.028782227561347254,
        "acc_norm": 0.36879432624113473,
        "acc_norm_stderr": 0.028782227561347254
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.39048239895697523,
        "acc_stderr": 0.012460135913945075,
        "acc_norm": 0.39048239895697523,
        "acc_norm_stderr": 0.012460135913945075
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.44485294117647056,
        "acc_stderr": 0.03018753206032939,
        "acc_norm": 0.44485294117647056,
        "acc_norm_stderr": 0.03018753206032939
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.4820261437908497,
        "acc_stderr": 0.020214761037872404,
        "acc_norm": 0.4820261437908497,
        "acc_norm_stderr": 0.020214761037872404
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.5727272727272728,
        "acc_stderr": 0.047381987035454834,
        "acc_norm": 0.5727272727272728,
        "acc_norm_stderr": 0.047381987035454834
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.6285714285714286,
        "acc_stderr": 0.030932858792789848,
        "acc_norm": 0.6285714285714286,
        "acc_norm_stderr": 0.030932858792789848
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.7213930348258707,
        "acc_stderr": 0.031700561834973086,
        "acc_norm": 0.7213930348258707,
        "acc_norm_stderr": 0.031700561834973086
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.71,
        "acc_stderr": 0.04560480215720684,
        "acc_norm": 0.71,
        "acc_norm_stderr": 0.04560480215720684
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.46987951807228917,
        "acc_stderr": 0.03885425420866766,
        "acc_norm": 0.46987951807228917,
        "acc_norm_stderr": 0.03885425420866766
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.7134502923976608,
        "acc_stderr": 0.03467826685703826,
        "acc_norm": 0.7134502923976608,
        "acc_norm_stderr": 0.03467826685703826
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.2607099143206854,
        "mc1_stderr": 0.015368841620766372,
        "mc2": 0.40370896766601344,
        "mc2_stderr": 0.014397183168821114
    },
    "harness|winogrande|5": {
        "acc": 0.664561957379637,
        "acc_stderr": 0.013269575904851434
    },
    "harness|gsm8k|5": {
        "acc": 0.42077331311599697,
        "acc_stderr": 0.013598489497182835
    }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
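The aggregated numbers quoted under "Latest results" above are also exposed through the "results" config that these detail datasets declare (visible in the previous record's metadata above). A minimal sketch of reading them, assuming the row layout mirrors the JSON shown above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the
# 2024-02-12T19:38:09.464033 run shown above.
results = load_dataset(
    "open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Hercules-v2.0",
    "results",
    split="latest",
)
print(results[0])  # assumed: one row per run carrying the aggregated scores
```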
open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Hercules-v2.0
[ "region:us" ]
2024-02-12T19:39:50+00:00
{"pretty_name": "Evaluation run of indischepartij/MiniCPM-3B-Hercules-v2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/MiniCPM-3B-Hercules-v2.0](https://huggingface.co/indischepartij/MiniCPM-3B-Hercules-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Hercules-v2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T19:38:09.464033](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Hercules-v2.0/blob/main/results_2024-02-12T19-38-09.464033.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5172565982528783,\n \"acc_stderr\": 0.03443539534813297,\n \"acc_norm\": 0.5200192814805001,\n \"acc_norm_stderr\": 0.03514172263989754,\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.40370896766601344,\n \"mc2_stderr\": 0.014397183168821114\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4052901023890785,\n \"acc_stderr\": 0.014346869060229325,\n \"acc_norm\": 0.4325938566552901,\n \"acc_norm_stderr\": 0.01447800569418253\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5245966938856802,\n \"acc_stderr\": 0.004983740145218602,\n \"acc_norm\": 0.711113324039036,\n \"acc_norm_stderr\": 0.004523188431142896\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739428,\n \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739428\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6041666666666666,\n \"acc_stderr\": 0.04089465449325583,\n \"acc_norm\": 0.6041666666666666,\n \"acc_norm_stderr\": 0.04089465449325583\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.04164188720169377,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.04164188720169377\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562427,\n \"acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562427\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.041349130183033156,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.041349130183033156\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6451612903225806,\n \"acc_stderr\": 0.02721888977330876,\n \"acc_norm\": 0.6451612903225806,\n \"acc_norm_stderr\": 0.02721888977330876\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.03878372113711274,\n \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.03878372113711274\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098617,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098617\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041152,\n \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041152\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954932,\n \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954932\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6862385321100918,\n \"acc_stderr\": 0.019894723341469127,\n \"acc_norm\": 0.6862385321100918,\n \"acc_norm_stderr\": 0.019894723341469127\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.03434131164719129,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.03434131164719129\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185812,\n \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185812\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041696,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041696\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.02581923325648371,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.02581923325648371\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.6666666666666666,\n \"acc_stderr\": 0.01685739124747255,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.01685739124747255\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940924,\n \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940924\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n \"acc_stderr\": 0.014655780837497736,\n \"acc_norm\": 0.25921787709497207,\n \"acc_norm_stderr\": 0.014655780837497736\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.565359477124183,\n \"acc_stderr\": 0.028384256704883037,\n \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.028384256704883037\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5709876543209876,\n \"acc_stderr\": 0.027538925613470863,\n \"acc_norm\": 0.5709876543209876,\n \"acc_norm_stderr\": 0.027538925613470863\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347254,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347254\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39048239895697523,\n \"acc_stderr\": 0.012460135913945075,\n \"acc_norm\": 0.39048239895697523,\n \"acc_norm_stderr\": 0.012460135913945075\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872404,\n \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872404\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789848,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789848\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n \"mc1_stderr\": 0.015368841620766372,\n \"mc2\": 0.40370896766601344,\n \"mc2_stderr\": 0.014397183168821114\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.664561957379637,\n \"acc_stderr\": 0.013269575904851434\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42077331311599697,\n \"acc_stderr\": 
0.013598489497182835\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/MiniCPM-3B-Hercules-v2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|arc:challenge|25_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|gsm8k|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hellaswag|10_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.464033.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.464033.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.464033.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.464033.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.464033.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T19_38_09.464033", "path": ["**/details_harness|winogrande|5_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T19-38-09.464033.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_12T19_38_09.464033", "path": ["results_2024-02-12T19-38-09.464033.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T19-38-09.464033.parquet"]}]}]}
2024-02-12T19:40:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Hercules-v2.0 Dataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-Hercules-v2.0 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T19:38:09.464033 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
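The "do the following" example is stripped from the card text above; a minimal sketch of the load it refers to, assuming the details repo follows the open-llm-leaderboard/details_<org>__<model> naming pattern visible in the other records of this dump:

```python
from datasets import load_dataset

# Repo id inferred from the card title (assumption: standard leaderboard naming);
# the "harness_winogrande_5" config and "train" split mirror the sibling cards.
data = load_dataset(
    "open-llm-leaderboard/details_indischepartij__MiniCPM-3B-Hercules-v2.0",
    "harness_winogrande_5",
    split="train",
)
```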
[ "# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Hercules-v2.0\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-Hercules-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T19:38:09.464033(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-Hercules-v2.0\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-Hercules-v2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T19:38:09.464033(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7946d04aa51bb02a36576879b5c2ec073dbb987e
Part of a dataset to convert text to functions for a robot. These are just the raw possible instructions; the text versions have not been made yet. There are about 2 million raw functions, with 10K converted to words.
VatsaDev/robofunctions
[ "license:mit", "region:us" ]
2024-02-12T19:41:58+00:00
{"license": "mit"}
2024-02-13T17:57:15+00:00
[]
[]
TAGS #license-mit #region-us
Part of a dataset to convert text to functions for a robot. These are just the raw possible instructions; the text versions have not been made yet. There are about 2 million raw functions, with 10K converted to words.
[]
[ "TAGS\n#license-mit #region-us \n" ]
b44a8abc0d64c70fc00807d95c97a6d055eeaf6d
# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-OpenHermes-2.5-v2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [indischepartij/MiniCPM-3B-OpenHermes-2.5-v2](https://huggingface.co/indischepartij/MiniCPM-3B-OpenHermes-2.5-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_indischepartij__MiniCPM-3B-OpenHermes-2.5-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T20:03:47.867498](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiniCPM-3B-OpenHermes-2.5-v2/blob/main/results_2024-02-12T20-03-47.867498.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5274704484249705, "acc_stderr": 0.03438524173597624, "acc_norm": 0.532836900103673, "acc_norm_stderr": 0.035101557474168814, "mc1": 0.2864137086903305, "mc1_stderr": 0.015826142439502342, "mc2": 0.4227589065713519, "mc2_stderr": 0.014631323260578519 }, "harness|arc:challenge|25": { "acc": 0.4308873720136519, "acc_stderr": 0.014471133392642468, "acc_norm": 0.47440273037542663, "acc_norm_stderr": 0.014592230885298967 }, "harness|hellaswag|10": { "acc": 0.5351523600876319, "acc_stderr": 0.004977434505403354, "acc_norm": 0.7199761003784106, "acc_norm_stderr": 0.004480929450281562 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.43703703703703706, "acc_stderr": 0.04284958639753399, "acc_norm": 0.43703703703703706, "acc_norm_stderr": 0.04284958639753399 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6118421052631579, "acc_stderr": 0.03965842097512744, "acc_norm": 0.6118421052631579, "acc_norm_stderr": 0.03965842097512744 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5773584905660377, "acc_stderr": 0.03040233144576954, "acc_norm": 0.5773584905660377, "acc_norm_stderr": 0.03040233144576954 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.04076663253918567, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.04076663253918567 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.44, "acc_stderr": 0.04988876515698589, "acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.28, "acc_stderr": 0.04512608598542127, "acc_norm": 0.28, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5260115606936416, "acc_stderr": 0.03807301726504513, "acc_norm": 0.5260115606936416, "acc_norm_stderr": 0.03807301726504513 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726367, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726367 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.4127659574468085, "acc_stderr": 0.03218471141400351, "acc_norm": 0.4127659574468085, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.34210526315789475, "acc_stderr": 0.04462917535336936, "acc_norm": 0.34210526315789475, "acc_norm_stderr": 0.04462917535336936 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.46206896551724136, "acc_stderr": 0.041546596717075474, "acc_norm": 0.46206896551724136, "acc_norm_stderr": 0.041546596717075474 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.35714285714285715, "acc_stderr": 0.02467786284133278, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.02467786284133278 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.35714285714285715, "acc_stderr": 0.04285714285714281, "acc_norm": 0.35714285714285715, "acc_norm_stderr": 0.04285714285714281 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6709677419354839, "acc_stderr": 0.02672949906834996, "acc_norm": 0.6709677419354839, "acc_norm_stderr": 0.02672949906834996 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5172413793103449, "acc_stderr": 0.035158955511656986, "acc_norm": 0.5172413793103449, "acc_norm_stderr": 0.035158955511656986 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.5515151515151515, "acc_stderr": 0.038835659779569286, "acc_norm": 0.5515151515151515, "acc_norm_stderr": 0.038835659779569286 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7070707070707071, "acc_stderr": 0.03242497958178815, "acc_norm": 0.7070707070707071, "acc_norm_stderr": 0.03242497958178815 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7202072538860104, "acc_stderr": 0.03239637046735704, "acc_norm": 0.7202072538860104, "acc_norm_stderr": 0.03239637046735704 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5025641025641026, "acc_stderr": 0.025350672979412202, "acc_norm": 0.5025641025641026, "acc_norm_stderr": 0.025350672979412202 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3074074074074074, "acc_stderr": 0.028133252578815635, "acc_norm": 0.3074074074074074, "acc_norm_stderr": 0.028133252578815635 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.5672268907563025, "acc_stderr": 0.032183581077426124, "acc_norm": 0.5672268907563025, "acc_norm_stderr": 0.032183581077426124 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 
0.31125827814569534, "acc_stderr": 0.03780445850526732, "acc_norm": 0.31125827814569534, "acc_norm_stderr": 0.03780445850526732 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.689908256880734, "acc_stderr": 0.019830849684439756, "acc_norm": 0.689908256880734, "acc_norm_stderr": 0.019830849684439756 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.39814814814814814, "acc_stderr": 0.033384734032074016, "acc_norm": 0.39814814814814814, "acc_norm_stderr": 0.033384734032074016 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.5882352941176471, "acc_stderr": 0.034542365853806094, "acc_norm": 0.5882352941176471, "acc_norm_stderr": 0.034542365853806094 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.6455696202531646, "acc_stderr": 0.031137304297185812, "acc_norm": 0.6455696202531646, "acc_norm_stderr": 0.031137304297185812 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.5829596412556054, "acc_stderr": 0.03309266936071721, "acc_norm": 0.5829596412556054, "acc_norm_stderr": 0.03309266936071721 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6259541984732825, "acc_stderr": 0.042438692422305246, "acc_norm": 0.6259541984732825, "acc_norm_stderr": 0.042438692422305246 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6859504132231405, "acc_stderr": 0.04236964753041018, "acc_norm": 0.6859504132231405, "acc_norm_stderr": 0.04236964753041018 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5740740740740741, "acc_stderr": 0.047803436269367894, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.047803436269367894 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.6503067484662577, "acc_stderr": 0.03746668325470021, "acc_norm": 0.6503067484662577, "acc_norm_stderr": 0.03746668325470021 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.375, "acc_stderr": 0.04595091388086298, "acc_norm": 0.375, "acc_norm_stderr": 0.04595091388086298 }, "harness|hendrycksTest-management|5": { "acc": 0.6699029126213593, "acc_stderr": 0.04656147110012351, "acc_norm": 0.6699029126213593, "acc_norm_stderr": 0.04656147110012351 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8162393162393162, "acc_stderr": 0.025372139671722933, "acc_norm": 0.8162393162393162, "acc_norm_stderr": 0.025372139671722933 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.49, "acc_stderr": 0.05024183937956911, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956911 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6756066411238825, "acc_stderr": 0.016740929047162696, "acc_norm": 0.6756066411238825, "acc_norm_stderr": 0.016740929047162696 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6040462427745664, "acc_stderr": 0.026329813341946243, "acc_norm": 0.6040462427745664, "acc_norm_stderr": 0.026329813341946243 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3005586592178771, "acc_stderr": 0.01533456680625116, "acc_norm": 0.3005586592178771, "acc_norm_stderr": 0.01533456680625116 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6111111111111112, "acc_stderr": 0.027914055510468, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.027914055510468 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5819935691318328, "acc_stderr": 0.028013651891995076, "acc_norm": 0.5819935691318328, "acc_norm_stderr": 0.028013651891995076 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5740740740740741, "acc_stderr": 0.027513747284379424, "acc_norm": 0.5740740740740741, "acc_norm_stderr": 0.027513747284379424 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.3829787234042553, "acc_stderr": 0.028999080904806185, "acc_norm": 0.3829787234042553, "acc_norm_stderr": 0.028999080904806185 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.394393741851369, "acc_stderr": 0.012482141665631183, "acc_norm": 0.394393741851369, "acc_norm_stderr": 0.012482141665631183 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4411764705882353, "acc_stderr": 0.03016191193076711, "acc_norm": 0.4411764705882353, "acc_norm_stderr": 0.03016191193076711 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.4869281045751634, "acc_stderr": 0.02022092082962691, "acc_norm": 0.4869281045751634, "acc_norm_stderr": 0.02022092082962691 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6, "acc_stderr": 0.0469237132203465, "acc_norm": 0.6, "acc_norm_stderr": 0.0469237132203465 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.636734693877551, "acc_stderr": 0.03078905113903081, "acc_norm": 0.636734693877551, "acc_norm_stderr": 0.03078905113903081 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7213930348258707, "acc_stderr": 0.031700561834973086, "acc_norm": 0.7213930348258707, "acc_norm_stderr": 0.031700561834973086 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-virology|5": { "acc": 0.42771084337349397, "acc_stderr": 0.038515976837185335, "acc_norm": 0.42771084337349397, "acc_norm_stderr": 0.038515976837185335 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7602339181286549, "acc_stderr": 0.032744852119469564, "acc_norm": 0.7602339181286549, "acc_norm_stderr": 0.032744852119469564 }, "harness|truthfulqa:mc|0": { "mc1": 0.2864137086903305, "mc1_stderr": 0.015826142439502342, "mc2": 0.4227589065713519, "mc2_stderr": 0.014631323260578519 }, "harness|winogrande|5": { "acc": 0.654301499605367, "acc_stderr": 0.01336659695193438 }, "harness|gsm8k|5": { "acc": 0.312357846853677, "acc_stderr": 0.012765850404191419 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_indischepartij__MiniCPM-3B-OpenHermes-2.5-v2
[ "region:us" ]
2024-02-12T19:49:25+00:00
{"pretty_name": "Evaluation run of indischepartij/MiniCPM-3B-OpenHermes-2.5-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/MiniCPM-3B-OpenHermes-2.5-v2](https://huggingface.co/indischepartij/MiniCPM-3B-OpenHermes-2.5-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__MiniCPM-3B-OpenHermes-2.5-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T20:03:47.867498](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiniCPM-3B-OpenHermes-2.5-v2/blob/main/results_2024-02-12T20-03-47.867498.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5274704484249705,\n \"acc_stderr\": 0.03438524173597624,\n \"acc_norm\": 0.532836900103673,\n \"acc_norm_stderr\": 0.035101557474168814,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4227589065713519,\n \"mc2_stderr\": 0.014631323260578519\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4308873720136519,\n \"acc_stderr\": 0.014471133392642468,\n \"acc_norm\": 0.47440273037542663,\n \"acc_norm_stderr\": 0.014592230885298967\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5351523600876319,\n \"acc_stderr\": 0.004977434505403354,\n \"acc_norm\": 0.7199761003784106,\n \"acc_norm_stderr\": 0.004480929450281562\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.03807301726504513,\n \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.03807301726504513\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726367,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726367\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.02467786284133278,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.02467786284133278\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7070707070707071,\n \"acc_stderr\": 0.03242497958178815,\n \"acc_norm\": 0.7070707070707071,\n \"acc_norm_stderr\": 0.03242497958178815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n \"acc_norm\": 0.7202072538860104,\n 
\"acc_norm_stderr\": 0.03239637046735704\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412202,\n \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412202\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.689908256880734,\n \"acc_stderr\": 0.019830849684439756,\n \"acc_norm\": 0.689908256880734,\n \"acc_norm_stderr\": 0.019830849684439756\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.034542365853806094,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.034542365853806094\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6455696202531646,\n \"acc_stderr\": 0.031137304297185812,\n \"acc_norm\": 0.6455696202531646,\n \"acc_norm_stderr\": 0.031137304297185812\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5829596412556054,\n \"acc_stderr\": 0.03309266936071721,\n \"acc_norm\": 0.5829596412556054,\n \"acc_norm_stderr\": 0.03309266936071721\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.04656147110012351,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.04656147110012351\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n \"acc_stderr\": 0.016740929047162696,\n \"acc_norm\": 0.6756066411238825,\n \"acc_norm_stderr\": 0.016740929047162696\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.026329813341946243,\n \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.026329813341946243\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n \"acc_stderr\": 0.01533456680625116,\n \"acc_norm\": 0.3005586592178771,\n \"acc_norm_stderr\": 0.01533456680625116\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n \"acc_stderr\": 0.028013651891995076,\n \"acc_norm\": 0.5819935691318328,\n \"acc_norm_stderr\": 0.028013651891995076\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.028999080904806185,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.028999080904806185\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.394393741851369,\n \"acc_stderr\": 0.012482141665631183,\n \"acc_norm\": 0.394393741851369,\n \"acc_norm_stderr\": 0.012482141665631183\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03016191193076711,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03016191193076711\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.02022092082962691,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.02022092082962691\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.03078905113903081,\n \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.03078905113903081\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502342,\n \"mc2\": 0.4227589065713519,\n \"mc2_stderr\": 0.014631323260578519\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.01336659695193438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.312357846853677,\n \"acc_stderr\": 
0.012765850404191419\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/MiniCPM-3B-OpenHermes-2.5-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|arc:challenge|25_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|arc:challenge|25_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|gsm8k|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|gsm8k|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hellaswag|10_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hellaswag|10_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-47-44.078712.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T19-47-44.078712.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-03-47.867498.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-03-47.867498.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-03-47.867498.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T20-03-47.867498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-03-47.867498.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": 
"2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T19-47-44.078712.parquet"]}, 
{"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["**/details_harness|winogrande|5_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": ["**/details_harness|winogrande|5_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T20-03-47.867498.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T19_47_44.078712", "path": ["results_2024-02-12T19-47-44.078712.parquet"]}, {"split": "2024_02_12T20_03_47.867498", "path": 
["results_2024-02-12T20-03-47.867498.parquet"]}, {"split": "latest", "path": ["results_2024-02-12T20-03-47.867498.parquet"]}]}]}
2024-02-12T20:05:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-OpenHermes-2.5-v2 Dataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-OpenHermes-2.5-v2 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T20:03:47.867498 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-OpenHermes-2.5-v2\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-OpenHermes-2.5-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T20:03:47.867498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of indischepartij/MiniCPM-3B-OpenHermes-2.5-v2\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiniCPM-3B-OpenHermes-2.5-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T20:03:47.867498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
649e44360aee9cb4fb35a7c7bf39f30db7e350c8
# World Health Organization (WHO) Epidemiological Update - Edition 163 (for embeddings) train.pdf is taken from the [WHO website](https://www.who.int/publications/m/item/covid-19-epidemiological-update---19-january-2024). test.csv was generated by GPT-3.5-turbo. All text is chunked to a length of 500 tokens with 10% overlap.
gadkins/who-covid-19-epidemiological-update-edition-163
[ "task_categories:text-generation", "size_categories:n<1K", "language:en", "fine-tuning", "touch rugby", "region:us" ]
2024-02-12T19:59:10+00:00
{"language": ["en"], "size_categories": ["n<1K"], "task_categories": ["text-generation"], "tags": ["fine-tuning", "touch rugby"]}
2024-02-16T21:13:29+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #size_categories-n<1K #language-English #fine-tuning #touch rugby #region-us
# World Health Organization (WHO) Epidemiological Update - Edition 163 (for embeddings) URL is taken from the WHO website URL was generated by GPT-3.5-turbo All text is chunked to a length of 500 tokens with 10% overlap.
[ "# World Health Organization (WHO) Epidemiological Update - Edition 163 (for embeddings)\n\nURL is taken from the WHO website\n\nURL was generated by GPT-3.5-turbo\n\nAll text is chunked to a length of 500 tokens with 10% overlap." ]
[ "TAGS\n#task_categories-text-generation #size_categories-n<1K #language-English #fine-tuning #touch rugby #region-us \n", "# World Health Organization (WHO) Epidemiological Update - Edition 163 (for embeddings)\n\nURL is taken from the WHO website\n\nURL was generated by GPT-3.5-turbo\n\nAll text is chunked to a length of 500 tokens with 10% overlap." ]
bf114692b5e480301b041bc603b4705be624d545
# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v9 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [andysalerno/rainbowfish-7B-v9](https://huggingface.co/andysalerno/rainbowfish-7B-v9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T20:14:46.042064](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9/blob/main/results_2024-02-12T20-14-46.042064.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6270678144594366, "acc_stderr": 0.03256950382813242, "acc_norm": 0.6331149722051534, "acc_norm_stderr": 0.033232016922454914, "mc1": 0.32558139534883723, "mc1_stderr": 0.016403989469907825, "mc2": 0.488208971713914, "mc2_stderr": 0.01514417520712263 }, "harness|arc:challenge|25": { "acc": 0.5827645051194539, "acc_stderr": 0.01440982551840308, "acc_norm": 0.6177474402730375, "acc_norm_stderr": 0.014200454049979277 }, "harness|hellaswag|10": { "acc": 0.6320454092810197, "acc_stderr": 0.004812633280078265, "acc_norm": 0.8243377813184625, "acc_norm_stderr": 0.003797548252851631 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.29, "acc_stderr": 0.045604802157206845, "acc_norm": 0.29, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6222222222222222, "acc_stderr": 0.04188307537595852, "acc_norm": 0.6222222222222222, "acc_norm_stderr": 0.04188307537595852 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6447368421052632, "acc_stderr": 0.03894734487013317, "acc_norm": 0.6447368421052632, "acc_norm_stderr": 0.03894734487013317 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.690566037735849, "acc_stderr": 0.028450154794118637, "acc_norm": 0.690566037735849, "acc_norm_stderr": 0.028450154794118637 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7430555555555556, "acc_stderr": 0.03653946969442099, "acc_norm": 0.7430555555555556, "acc_norm_stderr": 0.03653946969442099 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.5, "acc_stderr": 0.050251890762960605, "acc_norm": 0.5, "acc_norm_stderr": 0.050251890762960605 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.05021167315686779, "acc_norm": 0.52, "acc_norm_stderr": 0.05021167315686779 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr": 0.04878317312145632, "acc_norm": 0.38, "acc_norm_stderr": 0.04878317312145632 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6184971098265896, "acc_stderr": 0.03703851193099521, "acc_norm": 0.6184971098265896, "acc_norm_stderr": 0.03703851193099521 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082636, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082636 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5531914893617021, "acc_stderr": 0.0325005368436584, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.0325005368436584 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.047028804320496165, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5379310344827586, "acc_stderr": 0.04154659671707548, "acc_norm": 0.5379310344827586, "acc_norm_stderr": 0.04154659671707548 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.3915343915343915, "acc_stderr": 0.02513809138885111, "acc_norm": 0.3915343915343915, "acc_norm_stderr": 0.02513809138885111 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4444444444444444, "acc_stderr": 0.044444444444444495, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.044444444444444495 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7580645161290323, "acc_stderr": 0.024362599693031086, "acc_norm": 0.7580645161290323, "acc_norm_stderr": 0.024362599693031086 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.03192271569548301, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.03192271569548301 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7676767676767676, "acc_stderr": 0.030088629490217487, "acc_norm": 0.7676767676767676, "acc_norm_stderr": 0.030088629490217487 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8549222797927462, "acc_stderr": 0.025416343096306443, "acc_norm": 0.8549222797927462, "acc_norm_stderr": 0.025416343096306443 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6307692307692307, "acc_stderr": 0.024468615241478926, "acc_norm": 0.6307692307692307, "acc_norm_stderr": 0.024468615241478926 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948492, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948492 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6428571428571429, "acc_stderr": 0.031124619309328177, "acc_norm": 0.6428571428571429, "acc_norm_stderr": 0.031124619309328177 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33774834437086093, "acc_stderr": 
0.03861557546255169, "acc_norm": 0.33774834437086093, "acc_norm_stderr": 0.03861557546255169 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8110091743119267, "acc_stderr": 0.016785481159203627, "acc_norm": 0.8110091743119267, "acc_norm_stderr": 0.016785481159203627 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.47685185185185186, "acc_stderr": 0.034063153607115065, "acc_norm": 0.47685185185185186, "acc_norm_stderr": 0.034063153607115065 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8088235294117647, "acc_stderr": 0.02759917430064076, "acc_norm": 0.8088235294117647, "acc_norm_stderr": 0.02759917430064076 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7520661157024794, "acc_stderr": 0.039418975265163046, "acc_norm": 0.7520661157024794, "acc_norm_stderr": 0.039418975265163046 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7685185185185185, "acc_stderr": 0.04077494709252627, "acc_norm": 0.7685185185185185, "acc_norm_stderr": 0.04077494709252627 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7484662576687117, "acc_stderr": 0.03408997886857529, "acc_norm": 0.7484662576687117, "acc_norm_stderr": 0.03408997886857529 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.44642857142857145, "acc_stderr": 0.04718471485219588, "acc_norm": 0.44642857142857145, "acc_norm_stderr": 0.04718471485219588 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.04058042015646034, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.04058042015646034 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.02280138253459753, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.02280138253459753 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8033205619412516, "acc_stderr": 0.014214138556913915, "acc_norm": 0.8033205619412516, "acc_norm_stderr": 0.014214138556913915 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7196531791907514, "acc_stderr": 0.02418242749657761, "acc_norm": 0.7196531791907514, "acc_norm_stderr": 0.02418242749657761 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.3094972067039106, "acc_stderr": 0.015461169002371537, "acc_norm": 0.3094972067039106, "acc_norm_stderr": 0.015461169002371537 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7098765432098766, "acc_stderr": 0.025251173936495036, "acc_norm": 0.7098765432098766, "acc_norm_stderr": 0.025251173936495036 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4491525423728814, "acc_stderr": 0.012704030518851488, "acc_norm": 0.4491525423728814, "acc_norm_stderr": 0.012704030518851488 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.02850145286039656, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.02850145286039656 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6535947712418301, "acc_stderr": 0.019249785691717206, "acc_norm": 0.6535947712418301, "acc_norm_stderr": 0.019249785691717206 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.726530612244898, "acc_stderr": 0.028535560337128445, "acc_norm": 0.726530612244898, "acc_norm_stderr": 0.028535560337128445 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8308457711442786, "acc_stderr": 0.026508590656233278, "acc_norm": 0.8308457711442786, "acc_norm_stderr": 0.026508590656233278 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.03379976689896308, "acc_norm": 0.87, "acc_norm_stderr": 0.03379976689896308 }, "harness|hendrycksTest-virology|5": { "acc": 0.5481927710843374, "acc_stderr": 0.03874371556587953, "acc_norm": 0.5481927710843374, "acc_norm_stderr": 0.03874371556587953 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8070175438596491, "acc_stderr": 0.030267457554898458, "acc_norm": 0.8070175438596491, "acc_norm_stderr": 0.030267457554898458 }, "harness|truthfulqa:mc|0": { "mc1": 0.32558139534883723, "mc1_stderr": 0.016403989469907825, "mc2": 0.488208971713914, "mc2_stderr": 0.01514417520712263 }, "harness|winogrande|5": { "acc": 0.77663772691397, "acc_stderr": 0.011705697565205201 }, "harness|gsm8k|5": { "acc": 0.3479909021986353, "acc_stderr": 0.013120581030382132 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
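## Usage sketch

A minimal sketch to complement the loading snippet above, assuming only the standard `datasets` API; the configuration names (`results`, `harness_gsm8k_5`) and the `latest` split are taken from this dataset's own metadata, not invented here:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9"

# Aggregated metrics for the run; the "latest" split points at the most
# recent evaluation (here 2024-02-12T20:14:46.042064).
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for a single task, e.g. the 5-shot GSM8K configuration.
gsm8k_details = load_dataset(REPO, "harness_gsm8k_5", split="latest")

print(results[0])          # one row of aggregated metrics
print(len(gsm8k_details))  # number of evaluated GSM8K examples
```

Each of the 63 per-task configurations maps to timestamped parquet files (see the config list in the metadata), so the same pattern works for any task, with either the `latest` split or a specific timestamped split.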
open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9
[ "region:us" ]
2024-02-12T20:17:09+00:00
{"pretty_name": "Evaluation run of andysalerno/rainbowfish-7B-v9", "dataset_summary": "Dataset automatically created during the evaluation run of model [andysalerno/rainbowfish-7B-v9](https://huggingface.co/andysalerno/rainbowfish-7B-v9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T20:14:46.042064](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9/blob/main/results_2024-02-12T20-14-46.042064.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6270678144594366,\n \"acc_stderr\": 0.03256950382813242,\n \"acc_norm\": 0.6331149722051534,\n \"acc_norm_stderr\": 0.033232016922454914,\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.488208971713914,\n \"mc2_stderr\": 0.01514417520712263\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.01440982551840308,\n \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979277\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6320454092810197,\n \"acc_stderr\": 0.004812633280078265,\n \"acc_norm\": 0.8243377813184625,\n \"acc_norm_stderr\": 0.003797548252851631\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n 
\"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031086,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031086\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064076,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064076\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163046,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163046\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n \"acc_stderr\": 0.014214138556913915,\n 
\"acc_norm\": 0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n \"acc_stderr\": 0.015461169002371537,\n \"acc_norm\": 0.3094972067039106,\n \"acc_norm_stderr\": 0.015461169002371537\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n \"acc_stderr\": 0.012704030518851488,\n \"acc_norm\": 0.4491525423728814,\n \"acc_norm_stderr\": 0.012704030518851488\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717206,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717206\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233278,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233278\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.488208971713914,\n \"mc2_stderr\": 0.01514417520712263\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205201\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3479909021986353,\n \"acc_stderr\": 0.013120581030382132\n }\n}\n```", "repo_url": 
"https://huggingface.co/andysalerno/rainbowfish-7B-v9", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|arc:challenge|25_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|gsm8k|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hellaswag|10_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["**/details_harness|winogrande|5_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T20-14-46.042064.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T20_14_46.042064", "path": ["results_2024-02-12T20-14-46.042064.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T20-14-46.042064.parquet"]}]}]}
2024-02-12T20:17:33+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v9 Dataset automatically created during the evaluation run of model andysalerno/rainbowfish-7B-v9 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T20:14:46.042064 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
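A minimal loading sketch with the Hugging Face `datasets` library is shown below; the repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is just one of the 63 available configurations:

```python
from datasets import load_dataset

# Repository id is an assumption, following the leaderboard's usual
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9",
    "harness_winogrande_5",  # any of the 63 configurations works here
    split="train",
)
```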
[ "# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v9\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/rainbowfish-7B-v9 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T20:14:46.042064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v9\n\n\n\nDataset automatically created during the evaluation run of model andysalerno/rainbowfish-7B-v9 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T20:14:46.042064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
5797be10bbb8e939662d91167fc1e412700e5bed
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> Tokenizer: mbert Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
pgajo/EW-TT-PE_U0_S1_Tingredient_P1_DROP1_mbert
[ "region:us" ]
2024-02-12T20:17:49+00:00
{}
2024-02-12T20:17:56+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name Tokenizer: mbert Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6cd9fe58f09358ba27f9febaf174367d4004daaf
0: "ENOUGH_INFO" 1: "NOT_ENOUGH_INFO"
iestynmullinor/fever_reranker_training
[ "region:us" ]
2024-02-12T20:22:15+00:00
{}
2024-02-12T20:50:06+00:00
[]
[]
TAGS #region-us
0: "ENOUGH_INFO" 1: "NOT_ENOUGH_INFO"
[]
[ "TAGS\n#region-us \n" ]
b8b16a95e7abf89c48f609cc2f3b90201a8c1643
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
pgajo/EW-TT-PE_U0_S1_Tingredient_P1_DROP1_mdeberta
[ "region:us" ]
2024-02-12T20:35:10+00:00
{}
2024-02-12T20:35:17+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f0d0881becb9f992f60eb6fca3012484b1b8af99
# Dataset consisting of Polish jokes ## Warning: Jokes were not curated; some may be offensive, stupid or simply not funny. It's highly recommended to filter jokes before training, e.g., based on downvotes. This dataset consists of all (9k) jokes dumped from [jeja.pl](https://dowcipy.jeja.pl/) on 2024-02-14. Jokes are submitted by the community. Besides _the funny_ text itself, I included upvotes and downvotes. You can use them for filtering. Default sorting is based on a combination of downvotes and upvotes. If used for training LLMs, it's recommended to use a tokenizer that supports line breaks, as these are often important for readability of the jokes. ## Where to find me - [Github](https://github.com/JonaszPotoniec) - [Linkedin](https://www.linkedin.com/in/jonasz-potoniec/) - [E-mail](mailto:[email protected]) - [Telegram](https://t.me/JonaszPotoniec)
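As a minimal sketch of the recommended filtering, using the Hugging Face `datasets` library and the `upvotes`/`downvotes` fields from the dataset schema; the downvote threshold below is an arbitrary assumption, not a recommendation from the author:

```python
from datasets import load_dataset

# Load the jokes and keep only entries with few downvotes.
# The threshold of 5 is arbitrary; tune it to your quality needs.
jokes = load_dataset("JonaszPotoniec/dowcipy-polish-jokes-dataset", split="train")
filtered = jokes.filter(lambda row: row["downvotes"] < 5)
print(f"kept {len(filtered)} of {len(jokes)} jokes")
```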
JonaszPotoniec/dowcipy-polish-jokes-dataset
[ "task_categories:text-generation", "size_categories:1K<n<10K", "language:pl", "license:mit", "art", "region:us" ]
2024-02-12T20:56:06+00:00
{"language": ["pl"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation"], "pretty_name": "Dowcipy jaja", "dataset_info": {"features": [{"name": "joke", "dtype": "string"}, {"name": "upvotes", "dtype": "int64"}, {"name": "downvotes", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 3074127, "num_examples": 9020}], "download_size": 2061760, "dataset_size": 3074127}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["art"]}
2024-02-15T21:54:24+00:00
[]
[ "pl" ]
TAGS #task_categories-text-generation #size_categories-1K<n<10K #language-Polish #license-mit #art #region-us
# Dataset consisting of Polish jokes ## Warning: Jokes were not curated; some may be offensive, stupid or simply not funny. It's highly recommended to filter jokes before training, e.g., based on downvotes. This dataset consists of all (9k) jokes dumped from URL on 2024-02-14. Jokes are submitted by the community. Besides _the funny_ text itself, I included upvotes and downvotes. You can use them for filtering. Default sorting is based on a combination of downvotes and upvotes. If used for training LLMs, it's recommended to use a tokenizer that supports line breaks, as these are often important for readability of the jokes. ## Where to find me - Github - Linkedin - E-mail - Telegram
[ "# Dataset consisting of polish jokes", "## Warning: Jokes were not curated, some may be offensive, stupid or simply not funny. It's highly recommended to filter jokes before training, e.g., based on downvotes\n\nThis dataset consists of all (9k) jokes dumped from URL on 2024-02-14. Jokes are submitted by the community. Besides _the funny_ text itself, I included upvotes and downvotes. You can use them for filtering. \nDefault sorting is based on a combination of downvotes and upvotes. \nIf used for training LLMs, it's recommended to use a tokenizer that supports line breaks, as these are often important for readability of the jokes.", "## Where to find me\n\n- Github\n- Linkedin\n- E-mail\n- Telegram" ]
[ "TAGS\n#task_categories-text-generation #size_categories-1K<n<10K #language-Polish #license-mit #art #region-us \n", "# Dataset consisting of polish jokes", "## Warning: Jokes were not curated, some may be offensive, stupid or simply not funny. It's highly recommended to filter jokes before training, e.g., based on downvotes\n\nThis dataset consists of all (9k) jokes dumped from URL on 2024-02-14. Jokes are submitted by the community. Besides _the funny_ text itself, I included upvotes and downvotes. You can use them for filtering. \nDefault sorting is based on a combination of downvotes and upvotes. \nIf used for training LLMs, it's recommended to use a tokenizer that supports line breaks, as these are often important for readability of the jokes.", "## Where to find me\n\n- Github\n- Linkedin\n- E-mail\n- Telegram" ]
9aad1964c7dd25b4e7637cdac73b99c9f37ce2c1
# Hercules-v3.0 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6437292ecd93f4c9a34b0d47/vjFdcoktjUqvjKcBrQKob.png) - **Dataset Name:** Hercules-v3.0 - **Version:** 3.0 - **Release Date:** 2024-2-14 - **Number of Examples:** 1,637,895 - **Domains:** Math, Science, Biology, Physics, Instruction Following, Conversation, Computer Science, Roleplay, and more - **Languages:** Mostly English, but others can be detected. - **Task Types:** Question Answering, Conversational Modeling, Instruction Following, Code Generation, Roleplay ## Data Source Description Hercules-v3.0 is an extensive and diverse dataset that combines various domains to create a powerful tool for training artificial intelligence models. The data sources include conversations, coding examples, scientific explanations, and more. The dataset is sourced from multiple high-quality repositories, each contributing to the robustness of Hercules-v3.0 in different knowledge domains. ## Included Data Sources - `cognitivecomputations/dolphin` - `Evol Instruct 70K & 140K` - `teknium/GPT4-LLM-Cleaned` - `jondurbin/airoboros-3.2` - `AlekseyKorshuk/camel-chatml` - `CollectiveCognition/chats-data-2023-09-22` - `Nebulous/lmsys-chat-1m-smortmodelsonly` - `glaiveai/glaive-code-assistant-v2` - `glaiveai/glaive-code-assistant` - `glaiveai/glaive-function-calling-v2` - `garage-bAInd/Open-Platypus` - `meta-math/MetaMathQA` - `teknium/GPTeacher-General-Instruct` - `GPTeacher roleplay datasets` - `BI55/MedText` - `pubmed_qa labeled subset` - `Unnatural Instructions` - `M4-ai/LDJnr_combined_inout_format` - `CollectiveCognition/chats-data-2023-09-27` - `CollectiveCognition/chats-data-2023-10-16` - `NobodyExistsOnTheInternet/sharegptPIPPA` - `yuekai/openchat_sharegpt_v3_vicuna_format` - `ise-uiuc/Magicoder-Evol-Instruct-110K` - `Squish42/bluemoon-fandom-1-1-rp-cleaned` - `sablo/oasst2_curated` ## Data Characteristics The dataset amalgamates text from various domains, including structured and unstructured data. It contains dialogues, instructional texts, scientific explanations, coding tasks, and more. ## Intended Use Hercules-v3.0 is designed for training and evaluating AI models capable of handling complex tasks across multiple domains. It is suitable for researchers and developers in academia and industry working on advanced conversational agents, instruction-following models, and knowledge-intensive applications. ## Data Quality The data was collected from reputable sources with an emphasis on diversity and quality. It is expected to be relatively clean but may require additional preprocessing for specific tasks. ## Limitations and Bias - The dataset may have inherent biases from the original data sources. - Some domains may be overrepresented due to the nature of the source datasets. ## X-rated Content Disclaimer Hercules-v3.0 contains X-rated content. Users are solely responsible for the use of the dataset and must ensure that their use complies with all applicable laws and regulations. The dataset maintainers are not responsible for the misuse of the dataset. ## Usage Agreement By using the Hercules-v3.0 dataset, users agree to the following: - The dataset is used at the user's own risk. - The dataset maintainers are not liable for any damages arising from the use of the dataset. - Users will not hold the dataset maintainers responsible for any claims, liabilities, losses, or expenses. Please make sure to read the license for more information. 
## Citation ``` @misc{sebastian_gabarain_2024, title = {Hercules-v3.0: The "Golden Ratio" for High Quality Instruction Datasets}, author = {Sebastian Gabarain}, publisher = {HuggingFace}, year = {2024}, url = {https://huggingface.co/datasets/Locutusque/Hercules-v3.0} } ```
Locutusque/Hercules-v3.0
[ "task_categories:text-generation", "task_categories:question-answering", "task_categories:conversational", "language:en", "license:other", "region:us" ]
2024-02-12T21:03:00+00:00
{"language": ["en"], "license": "other", "task_categories": ["text-generation", "question-answering", "conversational"]}
2024-02-16T22:26:02+00:00
[]
[ "en" ]
TAGS #task_categories-text-generation #task_categories-question-answering #task_categories-conversational #language-English #license-other #region-us
# Hercules-v3.0 !image/png - Dataset Name: Hercules-v3.0 - Version: 3.0 - Release Date: 2024-2-14 - Number of Examples: 1,637,895 - Domains: Math, Science, Biology, Physics, Instruction Following, Conversation, Computer Science, Roleplay, and more - Languages: Mostly English, but others can be detected. - Task Types: Question Answering, Conversational Modeling, Instruction Following, Code Generation, Roleplay ## Data Source Description Hercules-v3.0 is an extensive and diverse dataset that combines various domains to create a powerful tool for training artificial intelligence models. The data sources include conversations, coding examples, scientific explanations, and more. The dataset is sourced from multiple high-quality repositories, each contributing to the robustness of Hercules-v3.0 in different knowledge domains. ## Included Data Sources - 'cognitivecomputations/dolphin' - 'Evol Instruct 70K & 140K' - 'teknium/GPT4-LLM-Cleaned' - 'jondurbin/airoboros-3.2' - 'AlekseyKorshuk/camel-chatml' - 'CollectiveCognition/chats-data-2023-09-22' - 'Nebulous/lmsys-chat-1m-smortmodelsonly' - 'glaiveai/glaive-code-assistant-v2' - 'glaiveai/glaive-code-assistant' - 'glaiveai/glaive-function-calling-v2' - 'garage-bAInd/Open-Platypus' - 'meta-math/MetaMathQA' - 'teknium/GPTeacher-General-Instruct' - 'GPTeacher roleplay datasets' - 'BI55/MedText' - 'pubmed_qa labeled subset' - 'Unnatural Instructions' - 'M4-ai/LDJnr_combined_inout_format' - 'CollectiveCognition/chats-data-2023-09-27' - 'CollectiveCognition/chats-data-2023-10-16' - 'NobodyExistsOnTheInternet/sharegptPIPPA' - 'yuekai/openchat_sharegpt_v3_vicuna_format' - 'ise-uiuc/Magicoder-Evol-Instruct-110K' - 'Squish42/bluemoon-fandom-1-1-rp-cleaned' - 'sablo/oasst2_curated' ## Data Characteristics The dataset amalgamates text from various domains, including structured and unstructured data. It contains dialogues, instructional texts, scientific explanations, coding tasks, and more. ## Intended Use Hercules-v3.0 is designed for training and evaluating AI models capable of handling complex tasks across multiple domains. It is suitable for researchers and developers in academia and industry working on advanced conversational agents, instruction-following models, and knowledge-intensive applications. ## Data Quality The data was collected from reputable sources with an emphasis on diversity and quality. It is expected to be relatively clean but may require additional preprocessing for specific tasks. ## Limitations and Bias - The dataset may have inherent biases from the original data sources. - Some domains may be overrepresented due to the nature of the source datasets. ## X-rated Content Disclaimer Hercules-v3.0 contains X-rated content. Users are solely responsible for the use of the dataset and must ensure that their use complies with all applicable laws and regulations. The dataset maintainers are not responsible for the misuse of the dataset. ## Usage Agreement By using the Hercules-v3.0 dataset, users agree to the following: - The dataset is used at the user's own risk. - The dataset maintainers are not liable for any damages arising from the use of the dataset. - Users will not hold the dataset maintainers responsible for any claims, liabilities, losses, or expenses. Please make sure to read the license for more information.
[ "# Hercules-v3.0\n\n!image/png\n\n- Dataset Name: Hercules-v3.0\n- Version: 3.0\n- Release Date: 2024-2-14\n- Number of Examples: 1,637,895\n- Domains: Math, Science, Biology, Physics, Instruction Following, Conversation, Computer Science, Roleplay, and more\n- Languages: Mostly English, but others can be detected.\n- Task Types: Question Answering, Conversational Modeling, Instruction Following, Code Generation, Roleplay", "## Data Source Description\nHercules-v3.0 is an extensive and diverse dataset that combines various domains to create a powerful tool for training artificial intelligence models. The data sources include conversations, coding examples, scientific explanations, and more. The dataset is sourced from multiple high-quality repositories, each contributing to the robustness of Hercules-v3.0 in different knowledge domains.", "## Included Data Sources\n- 'cognitivecomputations/dolphin'\n- 'Evol Instruct 70K & 140K'\n- 'teknium/GPT4-LLM-Cleaned'\n- 'jondurbin/airoboros-3.2'\n- 'AlekseyKorshuk/camel-chatml'\n- 'CollectiveCognition/chats-data-2023-09-22'\n- 'Nebulous/lmsys-chat-1m-smortmodelsonly'\n- 'glaiveai/glaive-code-assistant-v2'\n- 'glaiveai/glaive-code-assistant'\n- 'glaiveai/glaive-function-calling-v2'\n- 'garage-bAInd/Open-Platypus'\n- 'meta-math/MetaMathQA'\n- 'teknium/GPTeacher-General-Instruct'\n- 'GPTeacher roleplay datasets'\n- 'BI55/MedText'\n- 'pubmed_qa labeled subset'\n- 'Unnatural Instructions'\n- 'M4-ai/LDJnr_combined_inout_format'\n- 'CollectiveCognition/chats-data-2023-09-27'\n- 'CollectiveCognition/chats-data-2023-10-16'\n- 'NobodyExistsOnTheInternet/sharegptPIPPA'\n- 'yuekai/openchat_sharegpt_v3_vicuna_format'\n- 'ise-uiuc/Magicoder-Evol-Instruct-110K'\n- 'Squish42/bluemoon-fandom-1-1-rp-cleaned'\n- 'sablo/oasst2_curated'", "## Data Characteristics\nThe dataset amalgamates text from various domains, including structured and unstructured data. It contains dialogues, instructional texts, scientific explanations, coding tasks, and more.", "## Intended Use\nHercules-v3.0 is designed for training and evaluating AI models capable of handling complex tasks across multiple domains. It is suitable for researchers and developers in academia and industry working on advanced conversational agents, instruction-following models, and knowledge-intensive applications.", "## Data Quality\nThe data was collected from reputable sources with an emphasis on diversity and quality. It is expected to be relatively clean but may require additional preprocessing for specific tasks.", "## Limitations and Bias\n- The dataset may have inherent biases from the original data sources.\n- Some domains may be overrepresented due to the nature of the source datasets.", "## X-rated Content Disclaimer\nHercules-v3.0 contains X-rated content. Users are solely responsible for the use of the dataset and must ensure that their use complies with all applicable laws and regulations. The dataset maintainers are not responsible for the misuse of the dataset.", "## Usage Agreement\nBy using the Hercules-v3.0 dataset, users agree to the following:\n- The dataset is used at the user's own risk.\n- The dataset maintainers are not liable for any damages arising from the use of the dataset.\n- Users will not hold the dataset maintainers responsible for any claims, liabilities, losses, or expenses.\n\nPlease make sure to read the license for more information." ]
[ "TAGS\n#task_categories-text-generation #task_categories-question-answering #task_categories-conversational #language-English #license-other #region-us \n", "# Hercules-v3.0\n\n!image/png\n\n- Dataset Name: Hercules-v3.0\n- Version: 3.0\n- Release Date: 2024-2-14\n- Number of Examples: 1,637,895\n- Domains: Math, Science, Biology, Physics, Instruction Following, Conversation, Computer Science, Roleplay, and more\n- Languages: Mostly English, but others can be detected.\n- Task Types: Question Answering, Conversational Modeling, Instruction Following, Code Generation, Roleplay", "## Data Source Description\nHercules-v3.0 is an extensive and diverse dataset that combines various domains to create a powerful tool for training artificial intelligence models. The data sources include conversations, coding examples, scientific explanations, and more. The dataset is sourced from multiple high-quality repositories, each contributing to the robustness of Hercules-v3.0 in different knowledge domains.", "## Included Data Sources\n- 'cognitivecomputations/dolphin'\n- 'Evol Instruct 70K & 140K'\n- 'teknium/GPT4-LLM-Cleaned'\n- 'jondurbin/airoboros-3.2'\n- 'AlekseyKorshuk/camel-chatml'\n- 'CollectiveCognition/chats-data-2023-09-22'\n- 'Nebulous/lmsys-chat-1m-smortmodelsonly'\n- 'glaiveai/glaive-code-assistant-v2'\n- 'glaiveai/glaive-code-assistant'\n- 'glaiveai/glaive-function-calling-v2'\n- 'garage-bAInd/Open-Platypus'\n- 'meta-math/MetaMathQA'\n- 'teknium/GPTeacher-General-Instruct'\n- 'GPTeacher roleplay datasets'\n- 'BI55/MedText'\n- 'pubmed_qa labeled subset'\n- 'Unnatural Instructions'\n- 'M4-ai/LDJnr_combined_inout_format'\n- 'CollectiveCognition/chats-data-2023-09-27'\n- 'CollectiveCognition/chats-data-2023-10-16'\n- 'NobodyExistsOnTheInternet/sharegptPIPPA'\n- 'yuekai/openchat_sharegpt_v3_vicuna_format'\n- 'ise-uiuc/Magicoder-Evol-Instruct-110K'\n- 'Squish42/bluemoon-fandom-1-1-rp-cleaned'\n- 'sablo/oasst2_curated'", "## Data Characteristics\nThe dataset amalgamates text from various domains, including structured and unstructured data. It contains dialogues, instructional texts, scientific explanations, coding tasks, and more.", "## Intended Use\nHercules-v3.0 is designed for training and evaluating AI models capable of handling complex tasks across multiple domains. It is suitable for researchers and developers in academia and industry working on advanced conversational agents, instruction-following models, and knowledge-intensive applications.", "## Data Quality\nThe data was collected from reputable sources with an emphasis on diversity and quality. It is expected to be relatively clean but may require additional preprocessing for specific tasks.", "## Limitations and Bias\n- The dataset may have inherent biases from the original data sources.\n- Some domains may be overrepresented due to the nature of the source datasets.", "## X-rated Content Disclaimer\nHercules-v3.0 contains X-rated content. Users are solely responsible for the use of the dataset and must ensure that their use complies with all applicable laws and regulations. 
The dataset maintainers are not responsible for the misuse of the dataset.", "## Usage Agreement\nBy using the Hercules-v3.0 dataset, users agree to the following:\n- The dataset is used at the user's own risk.\n- The dataset maintainers are not liable for any damages arising from the use of the dataset.\n- Users will not hold the dataset maintainers responsible for any claims, liabilities, losses, or expenses.\n\nPlease make sure to read the license for more information." ]
973c22698dfe0688c81f683686cc28f9aca69b69
# Dataset Card for "live_ATC_QTR908" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
adityarra07/live_ATC_QTR908
[ "region:us" ]
2024-02-12T21:05:00+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}], "splits": [{"name": "train", "num_bytes": 5256119.0, "num_examples": 14}], "download_size": 4041367, "dataset_size": 5256119.0}}
2024-02-12T21:05:02+00:00
[]
[]
TAGS #region-us
# Dataset Card for "live_ATC_QTR908" More Information needed
[ "# Dataset Card for \"live_ATC_QTR908\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"live_ATC_QTR908\"\n\nMore Information needed" ]
ccdfd904bf737bfd7ce6fab9e629adbfa8391bd1
# Dataset Card for "live_ATC_ENY3808" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
adityarra07/live_ATC_ENY3808
[ "region:us" ]
2024-02-12T21:05:18+00:00
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}], "splits": [{"name": "train", "num_bytes": 15757515.0, "num_examples": 15}], "download_size": 13605142, "dataset_size": 15757515.0}}
2024-02-12T21:05:20+00:00
[]
[]
TAGS #region-us
# Dataset Card for "live_ATC_ENY3808" More Information needed
[ "# Dataset Card for \"live_ATC_ENY3808\"\n\nMore Information needed" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"live_ATC_ENY3808\"\n\nMore Information needed" ]
572e237b6682e41dd4c37761e248057763e654b0
# Budapest-v0.1 Dataset README ## Overview The Budapest-v0.1 dataset is a cutting-edge resource specifically designed for fine-tuning large language models (LLMs). Created using GPT-4, this dataset is presented in a message-response format, making it particularly suitable for a variety of natural language processing tasks. The primary focus of Budapest-v0.1 is to aid in the development and enhancement of algorithms capable of performing summarization, question answering, writing messages, and addressing open-ended questions. This dataset is fully Hungarian, catering to the development of Hungarian language models or the enhancement of multilingual models with Hungarian language capabilities. ## Dataset Composition - **Format**: The dataset is structured in a message-response style, where each entry consists of an input message followed by the model-generated response. This format is particularly useful for training models on conversational tasks and other natural language understanding and generation challenges. - **Input**: The input message provides context or a prompt for the model to respond to. - **Output**: The model-generated response is the output of the language model, providing a completion or answer to the input message. - **Category**: The dataset is designed to support a variety of tasks, including text generation, conversational modeling, and question answering. - **Language**: Fully in Hungarian, Budapest-v0.1 provides a valuable resource for Hungarian natural language processing tasks, contributing to the diversity of language representation in AI models. - **Generation**: Entirely generated by GPT-4, ensuring high-quality, contextually relevant, and syntactically varied data entries that can be used to fine-tune models for improved performance on specified tasks. ## Intended Use Cases The dataset is tailored for several key tasks in natural language processing: - **Summary**: Training models to condense information into concise summaries, capturing the essence of messages or documents. - **Question Answering**: Enhancing the ability of models to understand and accurately respond to questions based on provided or inferred information. - **Writing Messages**: Improving model performance in generating coherent, contextually appropriate messages in various formats (e.g., emails, chat responses). - **Open-Ended Questions**: Enabling models to handle and respond to open-ended queries, fostering creative and contextually relevant outputs. ## Testing and Experimentation Budapest-v0.1 is intended for testing and experimental purposes. Researchers and developers are encouraged to use this dataset to test the capabilities of their models, explore the nuances of language understanding and generation, and innovate in the realm of Hungarian natural language processing. ## Future Directions While Budapest-v0.1 is currently focused on supporting a select set of tasks in Hungarian, there is potential for expansion. Future versions may include a broader range of tasks, cover additional languages, or provide more diverse data types to support a wider array of NLP applications. ## Contribution and Feedback Contributions to the dataset and feedback on its use are welcome. Researchers and developers are encouraged to share their findings, suggest improvements, and discuss potential expansions that could enhance the dataset's utility for the NLP community. ## Note This README was also generated by GPT-4.
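A minimal sketch of loading the dataset and inspecting one message-response pair; the field names (`instruction`, `output`, `category`) follow the dataset metadata, and everything else is illustrative:

```python
from datasets import load_dataset

# Fields per the dataset metadata: instruction, output, category, text.
ds = load_dataset("Bazsalanszky/budapest-v0.1-hun", split="train")
example = ds[0]
print("Category:", example["category"])
print("Instruction:", example["instruction"])
print("Response:", example["output"])
```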
Bazsalanszky/budapest-v0.1-hun
[ "task_categories:text-generation", "task_categories:conversational", "task_categories:question-answering", "size_categories:1K<n<10K", "language:hu", "license:apache-2.0", "gpt4", "hungarian", "instruction-finetuning", "region:us" ]
2024-02-12T21:11:39+00:00
{"language": ["hu"], "license": "apache-2.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-generation", "conversational", "question-answering"], "dataset_info": {"features": [{"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4987359, "num_examples": 1146}], "download_size": 2717237, "dataset_size": 4987359}, "tags": ["gpt4", "hungarian", "instruction-finetuning"], "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-13T10:44:01+00:00
[]
[ "hu" ]
TAGS #task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-1K<n<10K #language-Hungarian #license-apache-2.0 #gpt4 #hungarian #instruction-finetuning #region-us
# Budapest-v0.1 Dataset README ## Overview The Budapest-v0.1 dataset is a cutting-edge resource specifically designed for fine-tuning large language models (LLMs). Created using GPT-4, this dataset is presented in a message-response format, making it particularly suitable for a variety of natural language processing tasks. The primary focus of Budapest-v0.1 is to aid in the development and enhancement of algorithms capable of performing summarization, question answering, writing messages, and addressing open-ended questions. This dataset is fully Hungarian, catering to the development of Hungarian language models or the enhancement of multilingual models with Hungarian language capabilities. ## Dataset Composition - Format: The dataset is structured in a message-response style, where each entry consists of an input message followed by the model-generated response. This format is particularly useful for training models on conversational tasks and other natural language understanding and generation challenges. - Input: The input message provides context or a prompt for the model to respond to. - Output: The model-generated response is the output of the language model, providing a completion or answer to the input message. - Category: The dataset is designed to support a variety of tasks, including text generation, conversational modeling, and question answering. - Language: Fully in Hungarian, Budapest-v0.1 provides a valuable resource for Hungarian natural language processing tasks, contributing to the diversity of language representation in AI models. - Generation: Entirely generated by GPT-4, ensuring high-quality, contextually relevant, and syntactically varied data entries that can be used to fine-tune models for improved performance on specified tasks. ## Intended Use Cases The dataset is tailored for several key tasks in natural language processing: - Summary: Training models to condense information into concise summaries, capturing the essence of messages or documents. - Question Answering: Enhancing the ability of models to understand and accurately respond to questions based on provided or inferred information. - Writing Messages: Improving model performance in generating coherent, contextually appropriate messages in various formats (e.g., emails, chat responses). - Open-Ended Questions: Enabling models to handle and respond to open-ended queries, fostering creative and contextually relevant outputs. ## Testing and Experimentation Budapest-v0.1 is intended for testing and experimental purposes. Researchers and developers are encouraged to use this dataset to test the capabilities of their models, explore the nuances of language understanding and generation, and innovate in the realm of Hungarian natural language processing. ## Future Directions While Budapest-v0.1 is currently focused on supporting a select set of tasks in Hungarian, there is potential for expansion. Future versions may include a broader range of tasks, cover additional languages, or provide more diverse data types to support a wider array of NLP applications. ## Contribution and Feedback Contributions to the dataset and feedback on its use are welcome. Researchers and developers are encouraged to share their findings, suggest improvements, and discuss potential expansions that could enhance the dataset's utility for the NLP community. ## Note This README was also generated by GPT-4.
[ "# Budapest-v0.1 Dataset README", "## Overview\n\nThe Budapest-v0.1 dataset is a cutting-edge resource specifically designed for fine-tuning large language models (LLMs). Created using GPT-4, this dataset is presented in a message-response format, making it particularly suitable for a variety of natural language processing tasks. The primary focus of Budapest-v0.1 is to aid in the development and enhancement of algorithms capable of performing summarization, question answering, writing messages, and addressing open-ended questions. This dataset is fully Hungarian, catering to the development of Hungarian language models or the enhancement of multilingual models with Hungarian language capabilities.", "## Dataset Composition\n\n- Format: The dataset is structured in a message-response style, where each entry consists of an input message followed by the model-generated response. This format is particularly useful for training models on conversational tasks and other natural language understanding and generation challenges.\n \n - Input: The input message provides context or a prompt for the model to respond to.\n \n - Output: The model-generated response is the output of the language model, providing a completion or answer to the input message.\n\n - Category: The dataset is designed to support a variety of tasks, including text generation, conversational modeling, and question answering.\n\n- Language: Fully in Hungarian, Budapest-v0.1 provides a valuable resource for Hungarian natural language processing tasks, contributing to the diversity of language representation in AI models.\n\n- Generation: Entirely generated by GPT-4, ensuring high-quality, contextually relevant, and syntactically varied data entries that can be used to fine-tune models for improved performance on specified tasks.", "## Intended Use Cases\n\nThe dataset is tailored for several key tasks in natural language processing:\n\n- Summary: Training models to condense information into concise summaries, capturing the essence of messages or documents.\n \n- Question Answering: Enhancing the ability of models to understand and accurately respond to questions based on provided or inferred information.\n \n- Writing Messages: Improving model performance in generating coherent, contextually appropriate messages in various formats (e.g., emails, chat responses).\n \n- Open-Ended Questions: Enabling models to handle and respond to open-ended queries, fostering creative and contextually relevant outputs.", "## Testing and Experimentation\n\nBudapest-v0.1 is intended for testing and experimental purposes. Researchers and developers are encouraged to use this dataset to test the capabilities of their models, explore the nuances of language understanding and generation, and innovate in the realm of Hungarian natural language processing.", "## Future Directions\n\nWhile Budapest-v0.1 is currently focused on supporting a select set of tasks in Hungarian, there is potential for expansion. Future versions may include a broader range of tasks, cover additional languages, or provide more diverse data types to support a wider array of NLP applications.", "## Contribution and Feedback\n\nContributions to the dataset and feedback on its use are welcome. Researchers and developers are encouraged to share their findings, suggest improvements, and discuss potential expansions that could enhance the dataset's utility for the NLP community.", "## Note\n\nThis README was also generated by GPT-4." ]
[ "TAGS\n#task_categories-text-generation #task_categories-conversational #task_categories-question-answering #size_categories-1K<n<10K #language-Hungarian #license-apache-2.0 #gpt4 #hungarian #instruction-finetuning #region-us \n", "# Budapest-v0.1 Dataset README", "## Overview\n\nThe Budapest-v0.1 dataset is a cutting-edge resource specifically designed for fine-tuning large language models (LLMs). Created using GPT-4, this dataset is presented in a message-response format, making it particularly suitable for a variety of natural language processing tasks. The primary focus of Budapest-v0.1 is to aid in the development and enhancement of algorithms capable of performing summarization, question answering, writing messages, and addressing open-ended questions. This dataset is fully Hungarian, catering to the development of Hungarian language models or the enhancement of multilingual models with Hungarian language capabilities.", "## Dataset Composition\n\n- Format: The dataset is structured in a message-response style, where each entry consists of an input message followed by the model-generated response. This format is particularly useful for training models on conversational tasks and other natural language understanding and generation challenges.\n \n - Input: The input message provides context or a prompt for the model to respond to.\n \n - Output: The model-generated response is the output of the language model, providing a completion or answer to the input message.\n\n - Category: The dataset is designed to support a variety of tasks, including text generation, conversational modeling, and question answering.\n\n- Language: Fully in Hungarian, Budapest-v0.1 provides a valuable resource for Hungarian natural language processing tasks, contributing to the diversity of language representation in AI models.\n\n- Generation: Entirely generated by GPT-4, ensuring high-quality, contextually relevant, and syntactically varied data entries that can be used to fine-tune models for improved performance on specified tasks.", "## Intended Use Cases\n\nThe dataset is tailored for several key tasks in natural language processing:\n\n- Summary: Training models to condense information into concise summaries, capturing the essence of messages or documents.\n \n- Question Answering: Enhancing the ability of models to understand and accurately respond to questions based on provided or inferred information.\n \n- Writing Messages: Improving model performance in generating coherent, contextually appropriate messages in various formats (e.g., emails, chat responses).\n \n- Open-Ended Questions: Enabling models to handle and respond to open-ended queries, fostering creative and contextually relevant outputs.", "## Testing and Experimentation\n\nBudapest-v0.1 is intended for testing and experimental purposes. Researchers and developers are encouraged to use this dataset to test the capabilities of their models, explore the nuances of language understanding and generation, and innovate in the realm of Hungarian natural language processing.", "## Future Directions\n\nWhile Budapest-v0.1 is currently focused on supporting a select set of tasks in Hungarian, there is potential for expansion. Future versions may include a broader range of tasks, cover additional languages, or provide more diverse data types to support a wider array of NLP applications.", "## Contribution and Feedback\n\nContributions to the dataset and feedback on its use are welcome. 
Researchers and developers are encouraged to share their findings, suggest improvements, and discuss potential expansions that could enhance the dataset's utility for the NLP community.", "## Note\n\nThis README was also generated by GPT-4." ]
5ffad03f8bb9e289072ba01ffcb0e25e8d16cb93
# Yukari (Princess Connect! Re:Dive)

Dataset of Yukari from Princess Connect! Re:Dive, containing 188 images and captions in .txt files, based on the Dreambooth caption method.

Main tags found in the dataset are:

Default Outfit: (yukaridef, hat, cross earrings, blue dress, orange ascot, white shirt, white gloves)

Summer Outfit: (yukarisu, white headwear, swimsuit, striped bikini, shirt, open clothes, skirt)

Camp Outfit: (yukariadv, cleavage, fur trim, white jacket, white top, navel, black gloves, denim shorts, short shorts, belt, black thighhighs, single thighhigh, thigh strap)

Images are crawled from many sites (e.g. danbooru, gelbooru, pixiv, etc.)

## List of Packages

| Name             | Images   | Size      | Download                                                                                                                  | Type       | Description                                                           |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:----------------------------------------------------------------------|
| priconneYukari   | 188      | 179 MiB   | [Download](https://huggingface.co/datasets/Hunko/PriconneYukari-Dataset/resolve/main/priconneYukari.zip)                  | IMG+TXT    | Dataset containing 4 subfolders with 188 images + .txt caption files   |

### Disclaimer

- This dataset is intended to be used in generative AI text-to-image models; it was created for the purpose of training a Stable Diffusion LoRA model.
- The dataset was built upon the Dreambooth caption method and follows this structure:

```
priconneYukari.zip /
├── dataset/
│   ├── norm/
│   │   ├── 0330b7cabafa5681c94f384ce27ea6b8_sayika_yukari_(princess_connect!).png
│   │   ├── 0330b7cabafa5681c94f384ce27ea6b8_sayika_yukari_(princess_connect!).txt
│   │   ├── 0c50f0b90cbf88a2c58a4a57e9f863d7_onikokko_yukari_(princess_connect!).png
│   │   └── ...
│   ├── yukaridef/
│   │   ├── 16fae665afb3738521a676de6b7c3ea6_tissuebox_(artist)_yukari_(princess_connect!)+yuuki_(princess_connect!).png
│   │   ├── 16fae665afb3738521a676de6b7c3ea6_tissuebox_(artist)_yukari_(princess_connect!)+yuuki_(princess_connect!).txt
│   │   ├── 1a7d33d96f3756c3ca8c344b245f61b2_omoomomo_yukari_(princess_connect!).png
│   │   └── ...
│   ├── yukarisu/
│   │   ├── 016326fc96f8512edca7ea2043af6082_sonchi_yukari_(princess_connect!).png
│   │   ├── 016326fc96f8512edca7ea2043af6082_sonchi_yukari_(princess_connect!).txt
│   │   ├── 0ec0d904d4d00e6f487a5eeb55bc5bcb_asahi_yanagi_akino_(princess_connect!)+mifuyu_(princess_connect!)+tamaki_(princess_connect!)+yukari_(princess_connect!).png
│   │   └── ...
│   ├── yukariadv/
│   │   ├── 084cc57706932ab22761e8cc62574c96_fuyu_(pixiv4365700)_yukari_(princess_connect!).png
│   │   ├── 084cc57706932ab22761e8cc62574c96_fuyu_(pixiv4365700)_yukari_(princess_connect!).txt
│   │   ├── 1a31926561b7db4e73eef8553711236f_moguru_yukari_(princess_connect!)+yuuki_(princess_connect!).png
│   │   └── ...
└── /
```

# License

This dataset is provided under the [Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](https://creativecommons.org/licenses/by-nc/4.0/) license.
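As a quick-start sketch, the package can be fetched and unpacked with the `huggingface_hub` library. The repo id and filename come from the download table above; the target directory is an arbitrary assumption, not part of the original release.

```python
# Minimal sketch: download and extract the priconneYukari package.
# The repo id and filename are taken from the dataset card's download
# table; the extraction target ("priconne_yukari/") is an assumption.
import zipfile

from huggingface_hub import hf_hub_download

zip_path = hf_hub_download(
    repo_id="Hunko/PriconneYukari-Dataset",
    filename="priconneYukari.zip",
    repo_type="dataset",
)

with zipfile.ZipFile(zip_path) as archive:
    # Yields the norm/, yukaridef/, yukarisu/, and yukariadv/ subfolders
    # with paired .png images and .txt caption files.
    archive.extractall("priconne_yukari/")
```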
Hunko/PriconneYukari-Dataset
[ "task_categories:text-to-image", "size_categories:n<1K", "license:cc-by-4.0", "art", "not-for-all-audiences", "region:us" ]
2024-02-12T21:29:48+00:00
{"license": "cc-by-4.0", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "pretty_name": "Princess Connect! Yukari Dataset", "tags": ["art", "not-for-all-audiences"]}
2024-02-12T21:47:48+00:00
[]
[]
TAGS #task_categories-text-to-image #size_categories-n<1K #license-cc-by-4.0 #art #not-for-all-audiences #region-us
Yukari (Princess Connect! Re:Dive)
==================================

Dataset of Yukari from Princess Connect! Re:Dive, containing 188 images and captions in .txt files, based on the Dreambooth caption method.

Main tags found in the dataset are:

Default Outfit: (yukaridef, hat, cross earrings, blue dress, orange ascot, white shirt, white gloves)

Summer Outfit: (yukarisu, white headwear, swimsuit, striped bikini, shirt, open clothes, skirt)

Camp Outfit: (yukariadv, cleavage, fur trim, white jacket, white top, navel, black gloves, denim shorts, short shorts, belt, black thighhighs, single thighhigh, thigh strap)

Images are crawled from many sites (e.g. danbooru, gelbooru, pixiv, etc.)

List of Packages
----------------

### Disclaimer

* This dataset is intended to be used in generative AI text-to-image models; it was created for the purpose of training a Stable Diffusion LoRA model.
* The dataset was built upon the Dreambooth caption method and follows this structure:

License
=======

This dataset is provided under the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license.
[ "### Disclaimer\n\n\n* This dataset is intented to be used in generative AI - text-to-image models, it was created with the intended purpose of making a Stable-diffusion LoRA model.\n* the dataset was built upon the Dreambooth caption method, the dataset follows this structure:\n\n\nLicense\n=======\n\n\nThis dataset is provided under the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license." ]
[ "TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-cc-by-4.0 #art #not-for-all-audiences #region-us \n", "### Disclaimer\n\n\n* This dataset is intented to be used in generative AI - text-to-image models, it was created with the intended purpose of making a Stable-diffusion LoRA model.\n* the dataset was built upon the Dreambooth caption method, the dataset follows this structure:\n\n\nLicense\n=======\n\n\nThis dataset is provided under the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license." ]
ad5301ea9861d22288262549610c1d698c862e0e
---
license: mit
task_categories:
- text-classification
language:
- zh
size_categories:
- 1K<n<10K
---

# Dataset Card for Educational Course Descriptions Dataset

## Dataset Details

### Dataset Description

- **Curated by:** Unknown
- **Language(s) (NLP):** Chinese
- **License:** MIT

This dataset comprises a diverse collection of educational course descriptions from various fields, including medicine, engineering, social sciences, and more. Each record contains an identification number, a course label, a detailed text describing the course, and the subject category. The texts offer insights into course content, objectives, and distinctive features, emphasizing each course's relevance and application in its respective field.

## Uses

### Direct Use

The dataset is ideal for analyzing and categorizing educational content, understanding curriculum structures, and developing educational resources or recommendation systems. It can be used in natural language processing to train models for text classification, summarization, and recommendation systems.

### Out-of-Scope Use

Misuse could include using the data for purposes other than educational analysis, such as commercial exploitation without proper licensing or using it to promote biased views in education.

## Dataset Structure

- **ID:** Unique identification number for each course
- **Label:** Name or title of the course
- **Text:** Description of the course including objectives and content
- **Subject Category:** Field or domain to which the course belongs

## Dataset Creation

### Curation Rationale

The dataset is curated to provide a comprehensive overview of various educational courses, which can be instrumental in academic research, curriculum development, and educational analysis.

### Source Data

#### Data Collection and Processing

Details on the data collection and processing methodologies are not provided.

## Bias, Risks, and Limitations

Users should be aware of potential biases in the representation of courses or subjects. The dataset's applicability might be limited by its scope, the diversity of courses, and the depth of descriptions.

### Recommendations

Further information is needed to provide specific recommendations regarding the dataset's use, taking into account its biases, risks, and limitations.

## More Information

Additional details about the dataset's origin, curation, and intended use are required for a comprehensive understanding and responsible application.
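As a quick-start sketch, the train/test splits declared in this dataset's configuration can be loaded with the `datasets` library. The repo id and split names come from this record's metadata; the exact JSON field names are assumptions based on the Dataset Structure section above.

```python
# Minimal sketch: load the course-description dataset.
# The repo id and the train/test splits come from the dataset
# configuration (train.json / test.json); the field names noted in the
# comments are assumptions based on the "Dataset Structure" section.
from datasets import load_dataset

courses = load_dataset("Xuehang/hi_smartedu_courses_datasets")

print(courses)                 # DatasetDict with "train" and "test" splits
example = courses["train"][0]  # expected fields: id, label, text, subject
print(example)                 # category (actual key names may differ)
```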
Xuehang/hi_smartedu_courses_datasets
[ "region:us" ]
2024-02-12T21:48:25+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": ["train.json"]}, {"split": "test", "path": ["test.json"]}]}]}
2024-02-15T17:10:04+00:00
[]
[]
TAGS #region-us
---
license: mit
task_categories:
- text-classification
language:
- zh
size_categories:
- 1K<n<10K
---

# Dataset Card for Educational Course Descriptions Dataset

## Dataset Details

### Dataset Description

- Curated by: Unknown
- Language(s) (NLP): Chinese
- License: MIT

This dataset comprises a diverse collection of educational course descriptions from various fields, including medicine, engineering, social sciences, and more. Each record contains an identification number, a course label, a detailed text describing the course, and the subject category. The texts offer insights into course content, objectives, and distinctive features, emphasizing each course's relevance and application in its respective field.

## Uses

### Direct Use

The dataset is ideal for analyzing and categorizing educational content, understanding curriculum structures, and developing educational resources or recommendation systems. It can be used in natural language processing to train models for text classification, summarization, and recommendation systems.

### Out-of-Scope Use

Misuse could include using the data for purposes other than educational analysis, such as commercial exploitation without proper licensing or using it to promote biased views in education.

## Dataset Structure

- ID: Unique identification number for each course
- Label: Name or title of the course
- Text: Description of the course including objectives and content
- Subject Category: Field or domain to which the course belongs

## Dataset Creation

### Curation Rationale

The dataset is curated to provide a comprehensive overview of various educational courses, which can be instrumental in academic research, curriculum development, and educational analysis.

### Source Data

#### Data Collection and Processing

Details on the data collection and processing methodologies are not provided.

## Bias, Risks, and Limitations

Users should be aware of potential biases in the representation of courses or subjects. The dataset's applicability might be limited by its scope, the diversity of courses, and the depth of descriptions.

### Recommendations

Further information is needed to provide specific recommendations regarding the dataset's use, taking into account its biases, risks, and limitations.

## More Information

Additional details about the dataset's origin, curation, and intended use are required for a comprehensive understanding and responsible application.
[ "# Dataset Card for Educational Course Descriptions Dataset", "## Dataset Details", "### Dataset Description\n\n- Curated by: Unknown\n- Language(s) (NLP): chinese\n- License: MIT\n\nThis dataset comprises a diverse collection of educational course descriptions from various fields, including medicine, engineering, social sciences, and more. Each record contains an identification number, a course label, a detailed text describing the course, and the subject category. The texts offer insights into course content, objectives, and distinctive features, emphasizing each course's relevance and application in its respective field.", "## Uses", "### Direct Use\n\nThe dataset is ideal for analyzing and categorizing educational content, understanding curriculum structures, and developing educational resources or recommendation systems. It can be used in natural language processing to train models for text classification, summarization, and recommendation systems.", "### Out-of-Scope Use\n\nMisuse could include using the data for purposes other than educational analysis, such as commercial exploitation without proper licensing or using it to promote biased views in education.", "## Dataset Structure\n\n- ID: Unique identification number for each course\n- Label: Name or title of the course\n- Text: Description of the course including objectives and content\n- Subject Category: Field or domain to which the course belongs", "## Dataset Creation", "### Curation Rationale\n\nThe dataset is curated to provide a comprehensive overview of various educational courses, which can be instrumental in academic research, curriculum development, and educational analysis.", "### Source Data", "#### Data Collection and Processing\n\nDetails on the data collection and processing methodologies are not provided.", "## Bias, Risks, and Limitations\n\nUsers should be aware of potential biases in the representation of courses or subjects. The dataset's applicability might be limited by its scope, the diversity of courses, and the depth of descriptions.", "### Recommendations\n\nFurther information is needed to provide specific recommendations regarding the dataset's use, taking into account its biases, risks, and limitations.", "## More Information\n\nAdditional details about the dataset's origin, curation, and intended use are required for a comprehensive understanding and responsible application." ]
[ "TAGS\n#region-us \n", "# Dataset Card for Educational Course Descriptions Dataset", "## Dataset Details", "### Dataset Description\n\n- Curated by: Unknown\n- Language(s) (NLP): chinese\n- License: MIT\n\nThis dataset comprises a diverse collection of educational course descriptions from various fields, including medicine, engineering, social sciences, and more. Each record contains an identification number, a course label, a detailed text describing the course, and the subject category. The texts offer insights into course content, objectives, and distinctive features, emphasizing each course's relevance and application in its respective field.", "## Uses", "### Direct Use\n\nThe dataset is ideal for analyzing and categorizing educational content, understanding curriculum structures, and developing educational resources or recommendation systems. It can be used in natural language processing to train models for text classification, summarization, and recommendation systems.", "### Out-of-Scope Use\n\nMisuse could include using the data for purposes other than educational analysis, such as commercial exploitation without proper licensing or using it to promote biased views in education.", "## Dataset Structure\n\n- ID: Unique identification number for each course\n- Label: Name or title of the course\n- Text: Description of the course including objectives and content\n- Subject Category: Field or domain to which the course belongs", "## Dataset Creation", "### Curation Rationale\n\nThe dataset is curated to provide a comprehensive overview of various educational courses, which can be instrumental in academic research, curriculum development, and educational analysis.", "### Source Data", "#### Data Collection and Processing\n\nDetails on the data collection and processing methodologies are not provided.", "## Bias, Risks, and Limitations\n\nUsers should be aware of potential biases in the representation of courses or subjects. The dataset's applicability might be limited by its scope, the diversity of courses, and the depth of descriptions.", "### Recommendations\n\nFurther information is needed to provide specific recommendations regarding the dataset's use, taking into account its biases, risks, and limitations.", "## More Information\n\nAdditional details about the dataset's origin, curation, and intended use are required for a comprehensive understanding and responsible application." ]
6b40fc4f4a3a14f25b94b19dc4e9db02f1771386
# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-dpo

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [huseyinatahaninan/phi-2-dpo](https://huggingface.co/huseyinatahaninan/phi-2-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-02-12T21:58:15.192256](https://huggingface.co/datasets/open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo/blob/main/results_2024-02-12T21-58-15.192256.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):

```python
{ "all": { "acc": 0.5870761708653485, "acc_stderr": 0.03369469581974977, "acc_norm": 0.5884353168964569, "acc_norm_stderr": 0.034381836157511524, "mc1": 0.3157894736842105, "mc1_stderr": 0.016272287957916912, "mc2": 0.45354154186159823, "mc2_stderr": 0.015221463708711597 }, "harness|arc:challenge|25": { "acc": 0.6040955631399317, "acc_stderr": 0.014291228393536588, "acc_norm": 0.6305460750853242, "acc_norm_stderr": 0.014104578366491897 }, "harness|hellaswag|10": { "acc": 0.5765783708424617, "acc_stderr": 0.004930911515084782, "acc_norm": 0.7635929097789285, "acc_norm_stderr": 0.004240066898702509 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.4444444444444444, "acc_stderr": 0.04292596718256981, "acc_norm": 0.4444444444444444, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5986842105263158, "acc_stderr": 0.039889037033362836, "acc_norm": 0.5986842105263158, "acc_norm_stderr": 0.039889037033362836 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.04975698519562428, "acc_norm": 0.57, "acc_norm_stderr": 0.04975698519562428 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6150943396226415, "acc_stderr": 0.02994649856769995, "acc_norm": 0.6150943396226415, "acc_norm_stderr": 0.02994649856769995 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6805555555555556, "acc_stderr": 0.038990736873573344, "acc_norm": 0.6805555555555556, "acc_norm_stderr": 0.038990736873573344 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.39, "acc_stderr": 0.04902071300001975, "acc_norm": 0.39, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.42, "acc_stderr": 0.04960449637488584, "acc_norm": 0.42, "acc_norm_stderr": 0.04960449637488584 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5953757225433526, "acc_stderr": 0.03742461193887248, "acc_norm": 0.5953757225433526, "acc_norm_stderr": 0.03742461193887248 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.73, "acc_stderr": 0.04461960433384739, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5106382978723404, "acc_stderr": 0.03267862331014063, "acc_norm": 0.5106382978723404, "acc_norm_stderr": 0.03267862331014063 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.3684210526315789, "acc_stderr": 0.04537815354939392, "acc_norm": 0.3684210526315789, "acc_norm_stderr": 0.04537815354939392 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.46825396825396826, "acc_stderr": 0.025699352832131796, "acc_norm": 0.46825396825396826, "acc_norm_stderr": 0.025699352832131796 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.373015873015873, "acc_stderr": 0.04325506042017086, "acc_norm": 0.373015873015873, "acc_norm_stderr": 0.04325506042017086 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7096774193548387, "acc_stderr": 0.025822106119415898, "acc_norm": 0.7096774193548387, "acc_norm_stderr": 0.025822106119415898 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4729064039408867, "acc_stderr": 0.03512819077876106, "acc_norm": 0.4729064039408867, "acc_norm_stderr": 0.03512819077876106 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6787878787878788, "acc_stderr": 0.036462049632538115, "acc_norm": 0.6787878787878788, "acc_norm_stderr": 0.036462049632538115 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7272727272727273, "acc_stderr": 0.03173071239071724, "acc_norm": 0.7272727272727273, "acc_norm_stderr": 0.03173071239071724 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8031088082901554, "acc_stderr": 0.02869787397186067, "acc_norm": 0.8031088082901554, "acc_norm_stderr": 0.02869787397186067 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5897435897435898, "acc_stderr": 0.024939313906940784, "acc_norm": 0.5897435897435898, "acc_norm_stderr": 0.024939313906940784 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.32222222222222224, "acc_stderr": 0.028493465091028597, "acc_norm": 0.32222222222222224, "acc_norm_stderr": 0.028493465091028597 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.592436974789916, "acc_stderr": 0.03191863374478465, "acc_norm": 0.592436974789916, "acc_norm_stderr": 0.03191863374478465 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.39072847682119205, "acc_stderr": 
0.039837983066598075, "acc_norm": 0.39072847682119205, "acc_norm_stderr": 0.039837983066598075 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8055045871559633, "acc_stderr": 0.01697028909045803, "acc_norm": 0.8055045871559633, "acc_norm_stderr": 0.01697028909045803 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.48148148148148145, "acc_stderr": 0.03407632093854052, "acc_norm": 0.48148148148148145, "acc_norm_stderr": 0.03407632093854052 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.6764705882352942, "acc_stderr": 0.03283472056108561, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.03283472056108561 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7468354430379747, "acc_stderr": 0.028304657943035296, "acc_norm": 0.7468354430379747, "acc_norm_stderr": 0.028304657943035296 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6502242152466368, "acc_stderr": 0.03200736719484503, "acc_norm": 0.6502242152466368, "acc_norm_stderr": 0.03200736719484503 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7175572519083969, "acc_stderr": 0.03948406125768361, "acc_norm": 0.7175572519083969, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7355371900826446, "acc_stderr": 0.040261875275912046, "acc_norm": 0.7355371900826446, "acc_norm_stderr": 0.040261875275912046 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7129629629629629, "acc_stderr": 0.043733130409147614, "acc_norm": 0.7129629629629629, "acc_norm_stderr": 0.043733130409147614 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5267857142857143, "acc_stderr": 0.047389751192741546, "acc_norm": 0.5267857142857143, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.7281553398058253, "acc_stderr": 0.044052680241409216, "acc_norm": 0.7281553398058253, "acc_norm_stderr": 0.044052680241409216 }, "harness|hendrycksTest-marketing|5": { "acc": 0.811965811965812, "acc_stderr": 0.025598193686652265, "acc_norm": 0.811965811965812, "acc_norm_stderr": 0.025598193686652265 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.6781609195402298, "acc_stderr": 0.0167063814150579, "acc_norm": 0.6781609195402298, "acc_norm_stderr": 0.0167063814150579 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6676300578034682, "acc_stderr": 0.025361168749688228, "acc_norm": 0.6676300578034682, "acc_norm_stderr": 0.025361168749688228 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.014465893829859924, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.014465893829859924 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6372549019607843, "acc_stderr": 0.02753007844711031, "acc_norm": 0.6372549019607843, "acc_norm_stderr": 0.02753007844711031 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6366559485530546, "acc_stderr": 0.027316847674192703, "acc_norm": 0.6366559485530546, "acc_norm_stderr": 0.027316847674192703 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6265432098765432, "acc_stderr": 0.02691500301138016, "acc_norm": 0.6265432098765432, "acc_norm_stderr": 0.02691500301138016 }, 
"harness|hendrycksTest-professional_accounting|5": { "acc": 0.4326241134751773, "acc_stderr": 0.029555454236778852, "acc_norm": 0.4326241134751773, "acc_norm_stderr": 0.029555454236778852 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4217731421121252, "acc_stderr": 0.012612974369390973, "acc_norm": 0.4217731421121252, "acc_norm_stderr": 0.012612974369390973 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.46691176470588236, "acc_stderr": 0.030306257722468314, "acc_norm": 0.46691176470588236, "acc_norm_stderr": 0.030306257722468314 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5637254901960784, "acc_stderr": 0.02006287424353913, "acc_norm": 0.5637254901960784, "acc_norm_stderr": 0.02006287424353913 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6545454545454545, "acc_stderr": 0.04554619617541054, "acc_norm": 0.6545454545454545, "acc_norm_stderr": 0.04554619617541054 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.710204081632653, "acc_stderr": 0.029043088683304328, "acc_norm": 0.710204081632653, "acc_norm_stderr": 0.029043088683304328 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8059701492537313, "acc_stderr": 0.027962677604768924, "acc_norm": 0.8059701492537313, "acc_norm_stderr": 0.027962677604768924 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.77, "acc_stderr": 0.042295258468165065, "acc_norm": 0.77, "acc_norm_stderr": 0.042295258468165065 }, "harness|hendrycksTest-virology|5": { "acc": 0.4759036144578313, "acc_stderr": 0.038879718495972646, "acc_norm": 0.4759036144578313, "acc_norm_stderr": 0.038879718495972646 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7076023391812866, "acc_stderr": 0.034886477134579215, "acc_norm": 0.7076023391812866, "acc_norm_stderr": 0.034886477134579215 }, "harness|truthfulqa:mc|0": { "mc1": 0.3157894736842105, "mc1_stderr": 0.016272287957916912, "mc2": 0.45354154186159823, "mc2_stderr": 0.015221463708711597 }, "harness|winogrande|5": { "acc": 0.7403314917127072, "acc_stderr": 0.012322700705552667 }, "harness|gsm8k|5": { "acc": 0.5670962850644428, "acc_stderr": 0.013647916362576054 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. 
news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
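Beyond the `load_dataset` example above, the aggregated results file linked under "Latest results" can also be fetched directly. In the sketch below, the repo id and filename come from that link; the key layout probed at the end is an assumption about the results file format.

```python
# Minimal sketch: fetch and parse the aggregated results file referenced
# under "Latest results". The repo id and filename are taken from that
# link; the "results"/"all" key layout is an assumption, so the code
# inspects the top-level keys and falls back to the root if needed.
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo",
    filename="results_2024-02-12T21-58-15.192256.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

print(list(results.keys()))                    # inspect the top-level layout first
aggregated = results.get("results", results)   # "results" key is an assumption
print(aggregated.get("all"))                   # the aggregated block shown above
```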
open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo
[ "region:us" ]
2024-02-12T21:59:58+00:00
{"pretty_name": "Evaluation run of huseyinatahaninan/phi-2-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [huseyinatahaninan/phi-2-dpo](https://huggingface.co/huseyinatahaninan/phi-2-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T21:58:15.192256](https://huggingface.co/datasets/open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo/blob/main/results_2024-02-12T21-58-15.192256.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5870761708653485,\n \"acc_stderr\": 0.03369469581974977,\n \"acc_norm\": 0.5884353168964569,\n \"acc_norm_stderr\": 0.034381836157511524,\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.45354154186159823,\n \"mc2_stderr\": 0.015221463708711597\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.014291228393536588,\n \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491897\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5765783708424617,\n \"acc_stderr\": 0.004930911515084782,\n \"acc_norm\": 0.7635929097789285,\n \"acc_norm_stderr\": 0.004240066898702509\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 
0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.025699352832131796,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.025699352832131796\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.025822106119415898,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.025822106119415898\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186067,\n \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186067\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n 
\"acc_stderr\": 0.024939313906940784,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940784\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.592436974789916,\n \"acc_stderr\": 0.03191863374478465,\n \"acc_norm\": 0.592436974789916,\n \"acc_norm_stderr\": 0.03191863374478465\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03283472056108561,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03283472056108561\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035296,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n \"acc_stderr\": 0.025598193686652265,\n \"acc_norm\": 0.811965811965812,\n \"acc_norm_stderr\": 0.025598193686652265\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6781609195402298,\n \"acc_stderr\": 0.0167063814150579,\n \"acc_norm\": 
0.6781609195402298,\n \"acc_norm_stderr\": 0.0167063814150579\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688228,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688228\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859924,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859924\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6372549019607843,\n \"acc_stderr\": 0.02753007844711031,\n \"acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.02753007844711031\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n \"acc_stderr\": 0.027316847674192703,\n \"acc_norm\": 0.6366559485530546,\n \"acc_norm_stderr\": 0.027316847674192703\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n \"acc_stderr\": 0.012612974369390973,\n \"acc_norm\": 0.4217731421121252,\n \"acc_norm_stderr\": 0.012612974369390973\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468314,\n \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5637254901960784,\n \"acc_stderr\": 0.02006287424353913,\n \"acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.02006287424353913\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768924,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768924\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916912,\n \"mc2\": 0.45354154186159823,\n \"mc2_stderr\": 0.015221463708711597\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5670962850644428,\n \"acc_stderr\": 0.013647916362576054\n }\n}\n```", "repo_url": 
"https://huggingface.co/huseyinatahaninan/phi-2-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|arc:challenge|25_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|gsm8k|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hellaswag|10_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T21-58-15.192256.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["**/details_harness|winogrande|5_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T21-58-15.192256.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T21_58_15.192256", "path": ["results_2024-02-12T21-58-15.192256.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T21-58-15.192256.parquet"]}]}]}
2024-02-12T22:00:24+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-dpo

Dataset automatically created during the evaluation run of model huseyinatahaninan/phi-2-dpo on the Open LLM Leaderboard.

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).

To load the details from a run, you can for instance do the following (see the sketch after this card):

## Latest results

These are the latest results from run 2024-02-12T21:58:15.192256 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):

## Dataset Details

### Dataset Description

- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:

### Dataset Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
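The load example referenced above was stripped from this processed text; the following is a minimal sketch of what it would look like. The repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming, and the config name is taken from the metadata above:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's "details_<org>__<model>" convention
data = load_dataset(
    "open-llm-leaderboard/details_huseyinatahaninan__phi-2-dpo",
    "harness_winogrande_5",  # one of the 63 configs listed in the metadata above
    split="train",           # per the card, "train" always points to the latest results
)
```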
[ "# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-dpo\n\n\n\nDataset automatically created during the evaluation run of model huseyinatahaninan/phi-2-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T21:58:15.192256(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of huseyinatahaninan/phi-2-dpo\n\n\n\nDataset automatically created during the evaluation run of model huseyinatahaninan/phi-2-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T21:58:15.192256(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a77f72fe12cdba09b1d5819876520671706a1529
# Wikinews - weakly aligned multilingual parallel sentence datasets

This dataset contains 15,200 multilingual WikiNews articles in 33 languages. Out of the 15,200 articles, 9,960 are non-English news articles and 5,240 are English news articles. All non-English articles are linked to one of the 5,240 English articles, and linked articles cover the same event.

The non-English languages are: Spanish, French, German, Portuguese, Polish, Italian, Chinese, Russian, Japanese, Dutch, Swedish, Tamil, Serbian, Czech, Catalan, Hebrew, Turkish, Finnish, Esperanto, Greek, Hungarian, Ukrainian, Norwegian, Arabic, Persian, Korean, Romanian, Bulgarian, Bosnian, Limburgish, Albanian, Thai.

## Dataset Details

### Example raw datasets

| | title | pageid | categories | lang | url | text | date | type |
|---|-------|--------|------------|------|-----|------|------|------|
| 0 | 'Bloody Sunday Inquiry' publishes report into ... | 191513 | [Northern Ireland, Martin McGuinness, Politics...] | en | https://en.wikinews.org/wiki/%27Bloody_Sunday_... | [On Tuesday, the "Bloody Sunday Inquiry" publi... | 2010-06-17 | title |
| 1 | 1972 ”இரத்த ஞாயிறு” படுகொலைகள் தொடர்பில் பிரித... | 191513 | [Northern Ireland, Martin McGuinness, Politics...] | ta | https://ta.wikinews.org/wiki/1972_%E2%80%9D%E0... | [வடக்கு அயர்லாந்தில் 38 ஆண்டுகளுக்கு முன்னர் இ... | வியாழன், சூன் 17, 2010 | interlang link |
| 2 | 'Very serious': Chinese government releases co... | 232226 | [China, December 30, 2010, Politics and confli...] | en | https://en.wikinews.org/wiki/%27Very_serious%2... | [A report by the Chinese government states cor... | 2010-12-30 | title |
| 3 | Čína připustila, že tamní korupce je vážný pro... | 232226 | [China, December 30, 2010, Politics and confli...] | cs | https://cs.wikinews.org/wiki/%C4%8C%C3%ADna_p%... | [Zpráva čínské vlády připouští, že korupce v z... | Středa 29. prosince 2010 | interlang link |
| 4 | China admite que la corrupción en el país es '... | 232226 | [China, December 30, 2010, Politics and confli...] | es | https://es.wikinews.org/wiki/China_admite_que_... | [29 de diciembre de 2010Beijing, China —, Un r... | None | interlang link |

### Variables

Each data point includes the following variables:

| Field Name | Description |
|------------|-------------|
| title | WikiNews article title |
| pageid | pageid defined by the English WikiNews article. Rows with the same pageid correspond to the same news event and are linked together. |
| categories | list of topics defined by WikiNews. All pages have at least one topic from [Crime and law, Culture and entertainment, Disasters and accidents, Economy and business, Education, Environment, Health, Obituaries, Politics and conflicts, Science and technology, Sports, Wackynews, Weather] |
| text | content of the article. Some non-English pages have news titles but no content; for those, text is left empty. |
| lang | language of the article (WP code; check [here](https://en.wikipedia.org/wiki/List_of_Wikipedias#Lists) for the list) |
| url | article's URL |
| date | date of publication in YYYY-MM-DD format for English pages. Dates on non-English pages were left as-is. To get a date in YYYY-MM-DD format, look up the English page with the same pageid. |
| type | `title` for the English page, `interlang link` for a non-English page linked to an English page via its `pageid`. |

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** Fumika Isono, Primer AI
- **Language(s) (NLP):** en, es, fr, de, pt, pl, it, zh, ru, ja, nl, sv, ta, sr, cs, ca, he, tr, fi, eo, el, hu, uk, 'no', ar, fa, ko, ro, bg, bs, li, sq, th
- **License:** cc-by-2.5

### Dataset Sources

<!-- Provide the basic links for the dataset. -->

- **Repository:** [Github](https://github.com/PrimerAI/primer-research/tree/main)
- **Paper:** ArXiv [Linear Cross-Lingual Mapping of Sentence Embeddings](https://arxiv.org/abs/2305.14256)

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Weakly aligned multilingual parallel sentence datasets

Weakly aligned multilingual parallel sentence datasets can be constructed by comparing the titles and/or contents of the WikiNews pages that are linked to the same English WikiNews page (in the dataset, they have the same pageid). Below is an example where the titles sharing the same pageid are retrieved; these five news titles all refer to the same incident (a construction sketch in code follows this card).

| News title | Language | type |
|------------|----------|------|
| Bomb blast in Delhi kills 12, injures 62 | English | title |
| چندین کشته بر اثر انفجار بمب در مقابل دادگاه عالی هند | Farsi | title |
| 9 נהרגו בפיגוע מחוץ לבית המשפט העליון של הודו | Hebrew | title |
| У Индији 11 мртвих, 64 повређених | Serbian | title |
| தில்லி உயர்நீதிமன்றத்தில் குண்டு வெடிப்பு, 10 பேர் உயிரிழப்பு | Tamil | title |

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

- Multilingual embeddings
- Language comparison

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

[Wikinews](https://www.wikinews.org/)

## Dataset Card Authors

Fumika Isono
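A minimal sketch of the weak-alignment construction described under Uses above. It assumes the dataset loads with the fields documented in the Variables table (`pageid`, `title`, `lang`); the split name is an assumption:

```python
from collections import defaultdict
from datasets import load_dataset

# Repo id taken from this record; the split name is an assumption
ds = load_dataset("Fumika/Wikinews-multilingual", split="train")

# Group article titles by the shared English pageid (same pageid = same news event)
titles_by_event = defaultdict(list)
for row in ds:
    titles_by_event[row["pageid"]].append((row["lang"], row["title"]))

# Weakly aligned title pairs: the English title vs. each linked non-English title
pairs = []
for pageid, titles in titles_by_event.items():
    english = [t for lang, t in titles if lang == "en"]
    if not english:
        continue
    pairs.extend((english[0], t) for lang, t in titles if lang != "en")

print(len(pairs), pairs[:3])
```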
Fumika/Wikinews-multilingual
[ "task_categories:text-classification", "task_categories:feature-extraction", "language:en", "language:es", "language:fr", "language:de", "language:pt", "language:pl", "language:it", "language:zh", "language:ru", "language:ja", "language:nl", "language:sv", "language:ta", "language:sr", "language:cs", "language:ca", "language:he", "language:tr", "language:fi", "language:eo", "language:el", "language:hu", "language:uk", "language:no", "language:ar", "language:fa", "language:ko", "language:ro", "language:bg", "language:bs", "language:li", "language:sq", "language:th", "license:cc-by-2.5", "arxiv:2305.14256", "region:us" ]
2024-02-12T22:02:39+00:00
{"language": ["en", "es", "fr", "de", "pt", "pl", "it", "zh", "ru", "ja", "nl", "sv", "ta", "sr", "cs", "ca", "he", "tr", "fi", "eo", "el", "hu", "uk", "no", "ar", "fa", "ko", "ro", "bg", "bs", "li", "sq", "th"], "license": "cc-by-2.5", "task_categories": ["text-classification", "feature-extraction"]}
2024-02-12T22:56:09+00:00
[ "2305.14256" ]
[ "en", "es", "fr", "de", "pt", "pl", "it", "zh", "ru", "ja", "nl", "sv", "ta", "sr", "cs", "ca", "he", "tr", "fi", "eo", "el", "hu", "uk", "no", "ar", "fa", "ko", "ro", "bg", "bs", "li", "sq", "th" ]
TAGS #task_categories-text-classification #task_categories-feature-extraction #language-English #language-Spanish #language-French #language-German #language-Portuguese #language-Polish #language-Italian #language-Chinese #language-Russian #language-Japanese #language-Dutch #language-Swedish #language-Tamil #language-Serbian #language-Czech #language-Catalan #language-Hebrew #language-Turkish #language-Finnish #language-Esperanto #language-Modern Greek (1453-) #language-Hungarian #language-Ukrainian #language-Norwegian #language-Arabic #language-Persian #language-Korean #language-Romanian #language-Bulgarian #language-Bosnian #language-Limburgan #language-Albanian #language-Thai #license-cc-by-2.5 #arxiv-2305.14256 #region-us
Wikinews - weakly aligned multilingual parallel sentence datasets
=================================================================

This dataset contains 15,200 multilingual WikiNews articles in 33 languages. Out of the 15,200 articles, 9,960 are non-English news articles and 5,240 are English news articles. All non-English articles are linked to one of the 5,240 English articles, and linked articles cover the same event.

The non-English languages are: Spanish, French, German, Portuguese, Polish, Italian, Chinese, Russian, Japanese, Dutch, Swedish, Tamil, Serbian, Czech, Catalan, Hebrew, Turkish, Finnish, Esperanto, Greek, Hungarian, Ukrainian, Norwegian, Arabic, Persian, Korean, Romanian, Bulgarian, Bosnian, Limburgish, Albanian, Thai.

Dataset Details
---------------

### Example raw datasets

### Variables

Each data point includes the following variables:

### Dataset Description

* Curated by: Fumika Isono, Primer AI
* Language(s) (NLP): en, es, fr, de, pt, pl, it, zh, ru, ja, nl, sv, ta, sr, cs, ca, he, tr, fi, eo, el, hu, uk, 'no', ar, fa, ko, ro, bg, bs, li, sq, th
* License: cc-by-2.5

### Dataset Sources

* Repository: Github
* Paper: ArXiv Linear Cross-Lingual Mapping of Sentence Embeddings

Uses
----

### Weakly aligned multilingual parallel sentence datasets

Weakly aligned multilingual parallel sentence datasets can be constructed by comparing the titles and/or contents of the WikiNews pages that are linked to the same English WikiNews page (in the dataset, they have the same pageid). Below is an example where the titles sharing the same pageid are retrieved; these five news titles all refer to the same incident.

News title: Bomb blast in Delhi kills 12, injures 62, Language: English, type: title
News title: چندین کشته بر اثر انفجار بمب در مقابل دادگاه عالی هند, Language: Farsi, type: title
News title: 9 נהרגו בפיגוע מחוץ לבית המשפט העליון של הודו, Language: Hebrew, type: title
News title: У Индији 11 мртвих, 64 повређених, Language: Serbian, type: title
News title: தில்லி உயர்நீதிமன்றத்தில் குண்டு வெடிப்பு, 10 பேர் உயிரிழப்பு, Language: Tamil, type: title

### Direct Use

* Multilingual embeddings
* Language comparison

### Source Data

Wikinews

Dataset Card Authors
--------------------

Fumika Isono
[ "### Example raw datasets", "### Variables\n\n\nEach data point includes following variables:", "### Dataset Description\n\n\n* Curated by: Fumika Isono, Primer AI\n* Language(s) (NLP): en, es, fr, de, pt, pl, it, zh, ru, ja, nl, sv, ta, sr, cs, ca, he, tr, fi, eo, el, hu, uk, 'no', ar, fa, ko, ro, bg, bs, li, sq, th\n* License: cc-by-2.5", "### Dataset Sources\n\n\n* Repository: Github\n* Paper: ArXiv Linear Cross-Lingual Mapping of Sentence Embeddings\n\n\nUses\n----", "### Weakly aligned multilingual pararell sentence datasets\n\n\nWeakly aligned multilingual pararell sentence datasets can be constructed by comparing the titles and/or contents of the WikiNews pages that are linked to the same English WikiNews page (in the dataset, they have the same pageid).\nFollowing is the example case where titles of the same pageid are retrieved. These five phrases (news titles) are the news titles of the same incident.\n\n\nNews title: Bomb blast in Delhi kills 12, injures 62, Language: English, type: title\nNews title: چندین کشته بر اثر انفجار بمب در مقابل دادگاه عالی هند, Language: Farsi, type: title\nNews title: 9 נהרגו בפיגוע מחוץ לבית המשפט העליון של הודו, Language: Hebrew, type: title\nNews title: У Индији 11 мртвих, 64 повређених, Language: Serbian, type: title\nNews title: தில்லி உயர்நீதிமன்றத்தில் குண்டு வெடிப்பு, 10 பேர் உயிரிழப்பு, Language: Tamil, type: title", "### Direct Use\n\n\n* Multilingual embeddings\n* Language comparison", "### Source Data\n\n\nWikinews\n\n\nDataset Card Authors\n--------------------\n\n\nFumika Isono" ]
[ "TAGS\n#task_categories-text-classification #task_categories-feature-extraction #language-English #language-Spanish #language-French #language-German #language-Portuguese #language-Polish #language-Italian #language-Chinese #language-Russian #language-Japanese #language-Dutch #language-Swedish #language-Tamil #language-Serbian #language-Czech #language-Catalan #language-Hebrew #language-Turkish #language-Finnish #language-Esperanto #language-Modern Greek (1453-) #language-Hungarian #language-Ukrainian #language-Norwegian #language-Arabic #language-Persian #language-Korean #language-Romanian #language-Bulgarian #language-Bosnian #language-Limburgan #language-Albanian #language-Thai #license-cc-by-2.5 #arxiv-2305.14256 #region-us \n", "### Example raw datasets", "### Variables\n\n\nEach data point includes following variables:", "### Dataset Description\n\n\n* Curated by: Fumika Isono, Primer AI\n* Language(s) (NLP): en, es, fr, de, pt, pl, it, zh, ru, ja, nl, sv, ta, sr, cs, ca, he, tr, fi, eo, el, hu, uk, 'no', ar, fa, ko, ro, bg, bs, li, sq, th\n* License: cc-by-2.5", "### Dataset Sources\n\n\n* Repository: Github\n* Paper: ArXiv Linear Cross-Lingual Mapping of Sentence Embeddings\n\n\nUses\n----", "### Weakly aligned multilingual pararell sentence datasets\n\n\nWeakly aligned multilingual pararell sentence datasets can be constructed by comparing the titles and/or contents of the WikiNews pages that are linked to the same English WikiNews page (in the dataset, they have the same pageid).\nFollowing is the example case where titles of the same pageid are retrieved. These five phrases (news titles) are the news titles of the same incident.\n\n\nNews title: Bomb blast in Delhi kills 12, injures 62, Language: English, type: title\nNews title: چندین کشته بر اثر انفجار بمب در مقابل دادگاه عالی هند, Language: Farsi, type: title\nNews title: 9 נהרגו בפיגוע מחוץ לבית המשפט העליון של הודו, Language: Hebrew, type: title\nNews title: У Индији 11 мртвих, 64 повређених, Language: Serbian, type: title\nNews title: தில்லி உயர்நீதிமன்றத்தில் குண்டு வெடிப்பு, 10 பேர் உயிரிழப்பு, Language: Tamil, type: title", "### Direct Use\n\n\n* Multilingual embeddings\n* Language comparison", "### Source Data\n\n\nWikinews\n\n\nDataset Card Authors\n--------------------\n\n\nFumika Isono" ]
0638a86690d76a15eedaeaa79a6cad2e07fba5f3
## Licensing Information

All rights belong to their respective authors, noted in the reference column. Usage of this dataset is possible only for personal purposes on a non-commercial basis.
NMashalov/ru_educational_book_datasets
[ "region:us" ]
2024-02-12T22:11:05+00:00
{"dataset_info": {"features": [{"name": "page", "dtype": {"image": {"decode": false}}}, {"name": "label", "dtype": {"class_label": {"names": {"0": "1001", "1": "Bek_linal", "2": "Kozel-SM-Sbornik-zadach-po-obschemu-kursu-fiziki-Chast-2-Elektrichestvo-i-magnetizm-Optika", "3": "Stereo_prasolov", "4": "UssrVopros.ru_\u2605_ekstremalnye_zadachi-1977", "5": "algebra_10-11_kolmogorov", "6": "shen-geometry"}}}}, {"name": "reference", "dtype": "string"}, {"name": "base64encoding", "dtype": "string"}, {"name": "ocr", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1300878591.316, "num_examples": 3172}], "download_size": 1285364011, "dataset_size": 1300878591.316}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
2024-02-13T21:14:12+00:00
[]
[]
TAGS #region-us
## Licensing Information

All rights belong to their respective authors, noted in the reference column. Usage of this dataset is possible only for personal purposes on a non-commercial basis.
[ "## Licensing Information\n\nAll rights belong to their respective authors noted in reference column. Usage of this dataset is possible only for personal purposes on a non-commercial basis." ]
[ "TAGS\n#region-us \n", "## Licensing Information\n\nAll rights belong to their respective authors noted in reference column. Usage of this dataset is possible only for personal purposes on a non-commercial basis." ]
338adb70d4fe6fb5aa657c7c2c92ac8b5c7949a7
# Indonesian Literature Corpus

## Description

This dataset contains a corpus in the Indonesian language taken from [Korpus Indonesia](https://korpusindonesia.kemdikbud.go.id/), provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus specifically focuses on literature texts, including various genres such as fiction, poetry, drama, and literary criticism.

## Contents

The dataset consists of texts in the Indonesian language that are categorized under the field of literature. These texts encompass a wide range of literary works, including novels, short stories, poems, plays, and critical essays.

## Usage

This dataset can be utilized for various research and development purposes related to Indonesian literature analysis, literary studies, authorship attribution, genre classification, sentiment analysis of literary texts, and other tasks within the domain of literature and humanities.

## Compatibility with Other Datasets

Please note that there might be some overlap between this dataset and the dataset available at [DamarJati/indocorpus-mix](https://github.com/DamarJati/indocorpus-mix). Some sentences or passages in this dataset may also appear in the aforementioned dataset.

## License

The dataset is retrieved from [Korpus Indonesia](https://korpusindonesia.kemdikbud.go.id/) provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source.

## References

For more information about Korpus Indonesia, please visit [https://korpusindonesia.kemdikbud.go.id/](https://korpusindonesia.kemdikbud.go.id/).
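The card does not document the schema, so only a minimal, hedged load sketch can be given here; the split name is an assumption:

```python
from datasets import load_dataset

# Repo id taken from this record; the dataset ships as parquet per its tags
corpus = load_dataset("DamarJati/indocorpus-sastra", split="train")  # split assumed
print(corpus)     # inspect the actual columns before relying on any field name
print(corpus[0])
```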
DamarJati/indocorpus-sastra
[ "task_categories:text2text-generation", "size_categories:10K<n<100K", "language:id", "corpus", "indonesia", "text", "parquet", "region:us" ]
2024-02-12T22:43:53+00:00
{"language": ["id"], "size_categories": ["10K<n<100K"], "task_categories": ["text2text-generation"], "pretty_name": "Sastra Indonesia", "tags": ["corpus", "indonesia", "text", "parquet"]}
2024-02-12T22:56:35+00:00
[]
[ "id" ]
TAGS #task_categories-text2text-generation #size_categories-10K<n<100K #language-Indonesian #corpus #indonesia #text #parquet #region-us
# Indonesian Literature Corpus

## Description

This dataset contains a corpus in the Indonesian language taken from Korpus Indonesia, provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus specifically focuses on literature texts, including various genres such as fiction, poetry, drama, and literary criticism.

## Contents

The dataset consists of texts in the Indonesian language that are categorized under the field of literature. These texts encompass a wide range of literary works, including novels, short stories, poems, plays, and critical essays.

## Usage

This dataset can be utilized for various research and development purposes related to Indonesian literature analysis, literary studies, authorship attribution, genre classification, sentiment analysis of literary texts, and other tasks within the domain of literature and humanities.

## Compatibility with Other Datasets

Please note that there might be some overlap between this dataset and the dataset available at DamarJati/indocorpus-mix. Some sentences or passages in this dataset may also appear in the aforementioned dataset.

## License

The dataset is retrieved from Korpus Indonesia provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source.

## References

For more information about Korpus Indonesia, please visit URL
[ "# Indonesian Literature Corpus", "## Description\nThis dataset contains a corpus in the Indonesian language taken from Korpus Indonesia, provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus specifically focuses on literature texts, including various genres such as fiction, poetry, drama, and literary criticism.", "## Contents\nThe dataset consists of texts in the Indonesian language that are categorized under the field of literature. These texts encompass a wide range of literary works, including novels, short stories, poems, plays, and critical essays.", "## Usage\nThis dataset can be utilized for various research and development purposes related to Indonesian literature analysis, literary studies, authorship attribution, genre classification, sentiment analysis of literary texts, and other tasks within the domain of literature and humanities.", "## Compatibility with Other Datasets\nPlease note that there might be some overlap between this dataset and the dataset available at DamarJati/indocorpus-mix. Some sentences or passages in this dataset may also appear in the aforementioned dataset.", "## License\nThe dataset is retrieved from Korpus Indonesia provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source.", "## References\nFor more information about Korpus Indonesia, please visit URL" ]
[ "TAGS\n#task_categories-text2text-generation #size_categories-10K<n<100K #language-Indonesian #corpus #indonesia #text #parquet #region-us \n", "# Indonesian Literature Corpus", "## Description\nThis dataset contains a corpus in the Indonesian language taken from Korpus Indonesia, provided by the Ministry of Education and Culture of the Republic of Indonesia. The corpus specifically focuses on literature texts, including various genres such as fiction, poetry, drama, and literary criticism.", "## Contents\nThe dataset consists of texts in the Indonesian language that are categorized under the field of literature. These texts encompass a wide range of literary works, including novels, short stories, poems, plays, and critical essays.", "## Usage\nThis dataset can be utilized for various research and development purposes related to Indonesian literature analysis, literary studies, authorship attribution, genre classification, sentiment analysis of literary texts, and other tasks within the domain of literature and humanities.", "## Compatibility with Other Datasets\nPlease note that there might be some overlap between this dataset and the dataset available at DamarJati/indocorpus-mix. Some sentences or passages in this dataset may also appear in the aforementioned dataset.", "## License\nThe dataset is retrieved from Korpus Indonesia provided by the Ministry of Education and Culture of the Republic of Indonesia. Please make sure to check and comply with the applicable licensing terms from the original source.", "## References\nFor more information about Korpus Indonesia, please visit URL" ]
dbf1bd4f161be741654736aec291633fc3e4f660
# Dataset Card for Dataset Name

<!-- Provide a quick summary of the dataset. -->

Tokenizer: mbert

Dataset: TASTEset

Unshuffled ratio: 0

Shuffled ratio: 1

Drop duplicates: True

Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
pgajo/EW-TT-PE_U0_S1_Tingredient_P0.5_DROP1_mbert
[ "region:us" ]
2024-02-12T23:08:44+00:00
{}
2024-02-12T23:08:52+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name

Tokenizer: mbert

Dataset: TASTEset

Unshuffled ratio: 0

Shuffled ratio: 1

Drop duplicates: True

Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL

## Dataset Details

### Dataset Description

- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:

### Dataset Sources [optional]

- Repository:
- Paper [optional]:
- Demo [optional]:

## Uses

### Direct Use

### Out-of-Scope Use

## Dataset Structure

## Dataset Creation

### Curation Rationale

### Source Data

#### Data Collection and Processing

#### Who are the source data producers?

### Annotations [optional]

#### Annotation process

#### Who are the annotators?

#### Personal and Sensitive Information

## Bias, Risks, and Limitations

### Recommendations

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

[optional]

BibTeX:

APA:

## Glossary [optional]

## More Information [optional]

## Dataset Card Authors [optional]

## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b68a38e24de88cbe7a2270a088a1fdeb996a818b
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
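The usage sections in the card above are unfilled template placeholders, so as a purely illustrative sketch (not documented by the card), the data behind this record could presumably be loaded with the standard `datasets` API via the repo id recorded below; the presence of a default config and split is an assumption:

```python
from datasets import load_dataset

# Hypothetical loading sketch: the repo id is taken from this record's id field;
# whether a default config/split exists is an assumption, not stated in the card.
ds = load_dataset("pgajo/EW-TT-PE_U0_S1_Tingredient_P0.5_DROP1_mdeberta")
print(ds)  # inspect the available splits and features
```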
pgajo/EW-TT-PE_U0_S1_Tingredient_P0.5_DROP1_mdeberta
[ "region:us" ]
2024-02-12T23:09:12+00:00
{}
2024-02-12T23:09:16+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
7d946cc83720fc5986d8bbd489968b851cd9fb86
# Dataset Card for Evaluation run of rombodawg/Everyone-LLM-7b-Base <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [rombodawg/Everyone-LLM-7b-Base](https://huggingface.co/rombodawg/Everyone-LLM-7b-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_rombodawg__Everyone-LLM-7b-Base", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-12T23:53:27.239989](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Everyone-LLM-7b-Base/blob/main/results_2024-02-12T23-53-27.239989.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6523061665050528, "acc_stderr": 0.03196249303085128, "acc_norm": 0.6532508904642476, "acc_norm_stderr": 0.03261034716522862, "mc1": 0.4039167686658507, "mc1_stderr": 0.01717727682258428, "mc2": 0.5788703327023509, "mc2_stderr": 0.015478489413186018 }, "harness|arc:challenge|25": { "acc": 0.6356655290102389, "acc_stderr": 0.014063260279882417, "acc_norm": 0.6638225255972696, "acc_norm_stderr": 0.01380485502620576 }, "harness|hellaswag|10": { "acc": 0.6771559450308704, "acc_stderr": 0.004666080865179642, "acc_norm": 0.8601872137024497, "acc_norm_stderr": 0.0034608394543291254 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695235, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695235 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 0.04923659639173309 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7132075471698113, "acc_stderr": 0.027834912527544067, "acc_norm": 0.7132075471698113, "acc_norm_stderr": 0.027834912527544067 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7569444444444444, "acc_stderr": 0.035868792800803406, "acc_norm": 0.7569444444444444, "acc_norm_stderr": 0.035868792800803406 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6705202312138728, "acc_stderr": 0.03583901754736412, "acc_norm": 0.6705202312138728, "acc_norm_stderr": 0.03583901754736412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.048971049527263666, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.048971049527263666 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.82, "acc_stderr": 0.03861229196653695, "acc_norm": 0.82, "acc_norm_stderr": 0.03861229196653695 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5872340425531914, "acc_stderr": 0.03218471141400351, "acc_norm": 0.5872340425531914, "acc_norm_stderr": 0.03218471141400351 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.047028804320496165, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.047028804320496165 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5310344827586206, "acc_stderr": 0.04158632762097828, "acc_norm": 0.5310344827586206, "acc_norm_stderr": 0.04158632762097828 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4603174603174603, "acc_stderr": 0.04458029125470973, "acc_norm": 0.4603174603174603, "acc_norm_stderr": 0.04458029125470973 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.36, "acc_stderr": 0.048241815132442176, "acc_norm": 0.36, "acc_norm_stderr": 0.048241815132442176 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7903225806451613, "acc_stderr": 0.023157879349083522, "acc_norm": 0.7903225806451613, "acc_norm_stderr": 0.023157879349083522 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5123152709359606, "acc_stderr": 0.035169204442208966, "acc_norm": 0.5123152709359606, "acc_norm_stderr": 0.035169204442208966 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.68, "acc_stderr": 0.04688261722621505, "acc_norm": 0.68, "acc_norm_stderr": 0.04688261722621505 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7696969696969697, "acc_stderr": 0.0328766675860349, "acc_norm": 0.7696969696969697, "acc_norm_stderr": 0.0328766675860349 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586818, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586818 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8911917098445595, "acc_stderr": 0.022473253332768766, "acc_norm": 0.8911917098445595, "acc_norm_stderr": 0.022473253332768766 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6743589743589744, "acc_stderr": 0.02375966576741229, "acc_norm": 0.6743589743589744, "acc_norm_stderr": 0.02375966576741229 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3888888888888889, "acc_stderr": 0.029723278961476664, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.029723278961476664 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6890756302521008, "acc_stderr": 0.030066761582977934, "acc_norm": 0.6890756302521008, "acc_norm_stderr": 0.030066761582977934 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3509933774834437, "acc_stderr": 
0.03896981964257375, "acc_norm": 0.3509933774834437, "acc_norm_stderr": 0.03896981964257375 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461787, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461787 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8016877637130801, "acc_stderr": 0.025955020841621115, "acc_norm": 0.8016877637130801, "acc_norm_stderr": 0.025955020841621115 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7786259541984732, "acc_stderr": 0.03641297081313728, "acc_norm": 0.7786259541984732, "acc_norm_stderr": 0.03641297081313728 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5, "acc_stderr": 0.04745789978762494, "acc_norm": 0.5, "acc_norm_stderr": 0.04745789978762494 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406957, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406957 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.0446196043338474, "acc_norm": 0.73, "acc_norm_stderr": 0.0446196043338474 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8339719029374202, "acc_stderr": 0.0133064782430663, "acc_norm": 0.8339719029374202, "acc_norm_stderr": 0.0133064782430663 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7369942196531792, "acc_stderr": 0.02370309952525817, "acc_norm": 0.7369942196531792, "acc_norm_stderr": 0.02370309952525817 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.33854748603351953, "acc_stderr": 0.01582670009648135, "acc_norm": 0.33854748603351953, "acc_norm_stderr": 0.01582670009648135 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7483660130718954, "acc_stderr": 0.0248480182638752, "acc_norm": 0.7483660130718954, "acc_norm_stderr": 0.0248480182638752 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7530864197530864, "acc_stderr": 0.023993501709042117, "acc_norm": 0.7530864197530864, "acc_norm_stderr": 0.023993501709042117 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4661016949152542, "acc_stderr": 0.01274085387294983, "acc_norm": 0.4661016949152542, "acc_norm_stderr": 0.01274085387294983 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7022058823529411, "acc_stderr": 0.027778298701545443, "acc_norm": 0.7022058823529411, "acc_norm_stderr": 0.027778298701545443 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6797385620915033, "acc_stderr": 0.018875682938069446, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.018875682938069446 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.88, "acc_stderr": 0.03265986323710906, "acc_norm": 0.88, "acc_norm_stderr": 0.03265986323710906 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.02796678585916089, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.02796678585916089 }, "harness|truthfulqa:mc|0": { "mc1": 0.4039167686658507, "mc1_stderr": 0.01717727682258428, "mc2": 0.5788703327023509, "mc2_stderr": 0.015478489413186018 }, "harness|winogrande|5": { "acc": 0.8042620363062352, "acc_stderr": 0.011151145042218324 }, "harness|gsm8k|5": { "acc": 0.6557998483699773, "acc_stderr": 0.013086800426693784 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
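Beyond the `harness_winogrande_5` example in the card above, the record's metadata enumerates one config per task (e.g. `harness_gsm8k_5`) together with a "latest" split pointing at the newest run; below is a hedged variant of the same loading pattern, using only names that appear in this card's own config list:

```python
from datasets import load_dataset

# "harness_gsm8k_5" is one of the configs enumerated in this card's metadata;
# per the metadata, the "latest" split tracks the most recent evaluation run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_rombodawg__Everyone-LLM-7b-Base",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```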
open-llm-leaderboard/details_rombodawg__Everyone-LLM-7b-Base
[ "region:us" ]
2024-02-12T23:55:50+00:00
{"pretty_name": "Evaluation run of rombodawg/Everyone-LLM-7b-Base", "dataset_summary": "Dataset automatically created during the evaluation run of model [rombodawg/Everyone-LLM-7b-Base](https://huggingface.co/rombodawg/Everyone-LLM-7b-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rombodawg__Everyone-LLM-7b-Base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-12T23:53:27.239989](https://huggingface.co/datasets/open-llm-leaderboard/details_rombodawg__Everyone-LLM-7b-Base/blob/main/results_2024-02-12T23-53-27.239989.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6523061665050528,\n \"acc_stderr\": 0.03196249303085128,\n \"acc_norm\": 0.6532508904642476,\n \"acc_norm_stderr\": 0.03261034716522862,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5788703327023509,\n \"mc2_stderr\": 0.015478489413186018\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882417,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.01380485502620576\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6771559450308704,\n \"acc_stderr\": 0.004666080865179642,\n \"acc_norm\": 0.8601872137024497,\n \"acc_norm_stderr\": 0.0034608394543291254\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461787,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461787\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n 
\"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545443,\n \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545443\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5788703327023509,\n \"mc2_stderr\": 0.015478489413186018\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6557998483699773,\n \"acc_stderr\": 0.013086800426693784\n }\n}\n```", "repo_url": "https://huggingface.co/rombodawg/Everyone-LLM-7b-Base", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|arc:challenge|25_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|gsm8k|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hellaswag|10_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T23-53-27.239989.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T23-53-27.239989.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-12T23-53-27.239989.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-12T23-53-27.239989.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T23-53-27.239989.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T23-53-27.239989.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["**/details_harness|winogrande|5_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-12T23-53-27.239989.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_12T23_53_27.239989", "path": ["results_2024-02-12T23-53-27.239989.parquet"]}, {"split": "latest", "path": 
["results_2024-02-12T23-53-27.239989.parquet"]}]}]}
2024-02-12T23:56:12+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of rombodawg/Everyone-LLM-7b-Base Dataset automatically created during the evaluation run of model rombodawg/Everyone-LLM-7b-Base on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-12T23:53:27.239989 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
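The load call the card alludes to ("you can for instance do the following") would, by analogy with the intact BarraHome/Lucie-7b-3e-5 card later in this dump, look roughly like the sketch below; the repo id follows the leaderboard's `details_<org>__<model>` convention and is an inference, not a quote from this record:

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_rombodawg__Everyone-LLM-7b-Base",
    "harness_winogrande_5",  # any config name from the metadata works here
    split="train",           # "train" always points at the latest results
)
```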
[ "# Dataset Card for Evaluation run of rombodawg/Everyone-LLM-7b-Base\n\n\n\nDataset automatically created during the evaluation run of model rombodawg/Everyone-LLM-7b-Base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T23:53:27.239989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of rombodawg/Everyone-LLM-7b-Base\n\n\n\nDataset automatically created during the evaluation run of model rombodawg/Everyone-LLM-7b-Base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-12T23:53:27.239989(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
b05b1a1cf64fc8b57f8df3bfee143de15afe8d99
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.25 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
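The card records the preprocessing parameters but no loading recipe; below is a minimal sketch, assuming only the repo id given in this record (`pgajo/EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mdeberta`) and the default configuration:

```python
from datasets import load_dataset

# Split and feature names are not documented on the card, so load the
# default configuration and inspect whatever it exposes.
ds = load_dataset("pgajo/EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mdeberta")
print(ds)  # prints the available splits and their features
```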
pgajo/EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mdeberta
[ "region:us" ]
2024-02-13T00:57:56+00:00
{}
2024-02-13T00:58:03+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.25 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.25\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.25\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
df4d3cf5449345310e9a362121d2caec94cdb62f
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> This dataset contains ~13.8 million instances of taxi rides. ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> ### Dataset Sources [optional] [New York City Taxi Fare Prediction](https://www.kaggle.com/competitions/new-york-city-taxi-fare-prediction/data)
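A loading sketch grounded in this record's metadata, which declares a "test" split of 9,914 rows with pickup/dropoff coordinates, `passenger_count`, and `fare_amount`; nothing beyond that metadata is assumed:

```python
from datasets import load_dataset

# The metadata declares a "test" split with 9,914 examples.
taxi = load_dataset("TaherMAfini/taxi_dataset", split="test")
print(taxi.features)           # key, pickup/dropoff coordinates, passenger_count, fare_amount
print(taxi[0]["fare_amount"])  # fields follow the schema in the metadata
```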
TaherMAfini/taxi_dataset
[ "region:us" ]
2024-02-13T00:58:20+00:00
{"dataset_info": {"features": [{"name": "key", "dtype": "string"}, {"name": "pickup_datetime", "dtype": "string"}, {"name": "pickup_longitude", "dtype": "float64"}, {"name": "pickup_latitude", "dtype": "float64"}, {"name": "dropoff_longitude", "dtype": "float64"}, {"name": "dropoff_latitude", "dtype": "float64"}, {"name": "passenger_count", "dtype": "int64"}, {"name": "fare_amount", "dtype": "float64"}], "splits": [{"name": "test", "num_bytes": 977751, "num_examples": 9914}], "download_size": 521219, "dataset_size": 977751}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}]}]}
2024-02-13T01:03:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name This dataset contains ~13.8 million instances of taxi rides. ### Dataset Description ### Dataset Sources [optional] New York City Taxi Fare Prediction
[ "# Dataset Card for Dataset Name\n\n\n\nThis dataset contains ~13.8 million instances of taxi rides.", "### Dataset Description", "### Dataset Sources [optional]\n\nNew York City Taxi Fare Prediction" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\nThis dataset contains ~13.8 million instances of taxi rides.", "### Dataset Description", "### Dataset Sources [optional]\n\nNew York City Taxi Fare Prediction" ]
a1e646bfbecd8eb101f2c1fee11fbac393771a44
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.75 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
pgajo/EW-TT-PE_U0_S1_Tingredient_P0.75_DROP1_mdeberta
[ "region:us" ]
2024-02-13T00:59:44+00:00
{}
2024-02-13T00:59:51+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name Tokenizer: mdeberta Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.75 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.75\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mdeberta\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.75\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
76aac8ce3c8520b118f46ad19aa6bdf271b25ab8
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> Tokenizer: mbert Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.75 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
pgajo/EW-TT-PE_U0_S1_Tingredient_P0.75_DROP1_mbert
[ "region:us" ]
2024-02-13T01:17:06+00:00
{}
2024-02-13T01:17:13+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name Tokenizer: mbert Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.75 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.75\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.75\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a0f17c5896685fbf5f40d33c4f54f736e062bad5
# Dataset Card for Dataset Name <!-- Provide a quick summary of the dataset. --> Tokenizer: mbert Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.25 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/EW-TT-PE.json ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. 
--> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
pgajo/EW-TT-PE_U0_S1_Tingredient_P0.25_DROP1_mbert
[ "region:us" ]
2024-02-13T01:17:22+00:00
{}
2024-02-13T01:17:26+00:00
[]
[]
TAGS #region-us
# Dataset Card for Dataset Name Tokenizer: mbert Dataset: TASTEset Unshuffled ratio: 0 Shuffled ratio: 1 Shuffle probability: 0.25 Drop duplicates: True Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.25\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Dataset Name\n\n\n\n\n Tokenizer: mbert\n\n Dataset: TASTEset\n\n Unshuffled ratio: 0\n\n Shuffled ratio: 1\n\n Shuffle probability: 0.25\n\n Drop duplicates: True\n\n Dataset path = /home/pgajo/working/food/data/TASTEset/data/EW-TASTE/URL", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
a838b7ce87bd43230b61a7e8e23fc18ad3d82346
Compiled neural activity dataset from GCaMP calcium imaging of _C. elegans_ from multiple experimental sources, standardized to a common format. ![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65cab82843207e438a9e7651/azsU9LuAl5N54FCTAK990.jpeg) CITATION: Simeon et al., 2024, "Scaling Properties for Artificial Neural Network Models of a Small Nervous System" (OpenReview.net)
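The card gives no loading recipe; below is a sketch using the "worm_data_short" configuration declared in this record's metadata (backed by `worm_data_short.csv`). Column names are undocumented, so they are inspected rather than assumed, and the "train" split name is the hub's default for a single CSV data file:

```python
from datasets import load_dataset

# "worm_data_short" is the one configuration declared in the metadata below.
worms = load_dataset("qsimeon/celegans_neural_data", "worm_data_short")
print(worms)                        # shows the splits and features
print(worms["train"].column_names)  # "train" assumed as the default split for a CSV file
```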
qsimeon/celegans_neural_data
[ "language:en", "license:mit", "region:us" ]
2024-02-13T01:22:38+00:00
{"language": ["en"], "license": "mit", "configs": [{"config_name": "worm_data_short", "data_files": "worm_data_short.csv"}]}
2024-02-14T03:46:39+00:00
[]
[ "en" ]
TAGS #language-English #license-mit #region-us
Compiled neural activity dataset from GCaMP calcium imaging of _C. elegans_ from multiple experimental sources, standardized to a common format. !image/jpeg CITATION: Simeon et al., 2024, "Scaling Properties for Artificial Neural Network Models of a Small Nervous System" (URL)
[]
[ "TAGS\n#language-English #license-mit #region-us \n" ]
acb85e6e339d5a931d5586220a87782fa3a11326
# Dataset Card for Evaluation run of BarraHome/Lucie-7b-3e-5 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [BarraHome/Lucie-7b-3e-5](https://huggingface.co/BarraHome/Lucie-7b-3e-5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_BarraHome__Lucie-7b-3e-5", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-13T01:22:54.103083](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Lucie-7b-3e-5/blob/main/results_2024-02-13T01-22-54.103083.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6032184784518743, "acc_stderr": 0.03333730204729809, "acc_norm": 0.607891645213564, "acc_norm_stderr": 0.03401402537730786, "mc1": 0.5226438188494492, "mc1_stderr": 0.01748554225848964, "mc2": 0.6766513448639357, "mc2_stderr": 0.015264009667659464 }, "harness|arc:challenge|25": { "acc": 0.575938566552901, "acc_stderr": 0.014441889627464392, "acc_norm": 0.6220136518771331, "acc_norm_stderr": 0.0141696645203031 }, "harness|hellaswag|10": { "acc": 0.6612228639713205, "acc_stderr": 0.004723266971563391, "acc_norm": 0.8481378211511651, "acc_norm_stderr": 0.0035815378475817935 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5777777777777777, "acc_stderr": 0.04266763404099582, "acc_norm": 0.5777777777777777, "acc_norm_stderr": 0.04266763404099582 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.625, "acc_stderr": 0.039397364351956274, "acc_norm": 0.625, "acc_norm_stderr": 0.039397364351956274 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6754716981132075, "acc_stderr": 0.02881561571343211, "acc_norm": 0.6754716981132075, "acc_norm_stderr": 0.02881561571343211 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.6597222222222222, "acc_stderr": 0.039621355734862175, "acc_norm": 0.6597222222222222, "acc_norm_stderr": 0.039621355734862175 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.47, "acc_stderr": 0.05016135580465919, "acc_norm": 0.47, "acc_norm_stderr": 0.05016135580465919 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.38, "acc_stderr":
0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.5838150289017341, "acc_stderr": 0.03758517775404947, "acc_norm": 0.5838150289017341, "acc_norm_stderr": 0.03758517775404947 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.43137254901960786, "acc_stderr": 0.04928099597287534, "acc_norm": 0.43137254901960786, "acc_norm_stderr": 0.04928099597287534 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5148936170212766, "acc_stderr": 0.03267151848924777, "acc_norm": 0.5148936170212766, "acc_norm_stderr": 0.03267151848924777 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.43859649122807015, "acc_stderr": 0.04668000738510455, "acc_norm": 0.43859649122807015, "acc_norm_stderr": 0.04668000738510455 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5724137931034483, "acc_stderr": 0.041227371113703316, "acc_norm": 0.5724137931034483, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.38095238095238093, "acc_stderr": 0.025010749116137602, "acc_norm": 0.38095238095238093, "acc_norm_stderr": 0.025010749116137602 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.3888888888888889, "acc_stderr": 0.04360314860077459, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.04360314860077459 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.6774193548387096, "acc_stderr": 0.026593084516572277, "acc_norm": 0.6774193548387096, "acc_norm_stderr": 0.026593084516572277 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.61, "acc_stderr": 0.04902071300001975, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001975 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7212121212121212, "acc_stderr": 0.03501438706296781, "acc_norm": 0.7212121212121212, "acc_norm_stderr": 0.03501438706296781 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7474747474747475, "acc_stderr": 0.030954055470365897, "acc_norm": 0.7474747474747475, "acc_norm_stderr": 0.030954055470365897 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.844559585492228, "acc_stderr": 0.026148483469153314, "acc_norm": 0.844559585492228, "acc_norm_stderr": 0.026148483469153314 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.5564102564102564, "acc_stderr": 0.0251891498947642, "acc_norm": 0.5564102564102564, "acc_norm_stderr": 0.0251891498947642 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.337037037037037, "acc_stderr": 0.028820884666253255, "acc_norm": 0.337037037037037, "acc_norm_stderr": 0.028820884666253255 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.634453781512605, "acc_stderr": 0.031282177063684614, "acc_norm": 0.634453781512605, "acc_norm_stderr": 0.031282177063684614 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 
0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8018348623853211, "acc_stderr": 0.017090573804217905, "acc_norm": 0.8018348623853211, "acc_norm_stderr": 0.017090573804217905 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4398148148148148, "acc_stderr": 0.03385177976044812, "acc_norm": 0.4398148148148148, "acc_norm_stderr": 0.03385177976044812 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7549019607843137, "acc_stderr": 0.03019028245350195, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.03019028245350195 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7510548523206751, "acc_stderr": 0.028146970599422644, "acc_norm": 0.7510548523206751, "acc_norm_stderr": 0.028146970599422644 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6322869955156951, "acc_stderr": 0.03236198350928275, "acc_norm": 0.6322869955156951, "acc_norm_stderr": 0.03236198350928275 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6946564885496184, "acc_stderr": 0.040393149787245605, "acc_norm": 0.6946564885496184, "acc_norm_stderr": 0.040393149787245605 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7037037037037037, "acc_stderr": 0.04414343666854933, "acc_norm": 0.7037037037037037, "acc_norm_stderr": 0.04414343666854933 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7300613496932515, "acc_stderr": 0.03487825168497892, "acc_norm": 0.7300613496932515, "acc_norm_stderr": 0.03487825168497892 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.04726835553719099, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.04726835553719099 }, "harness|hendrycksTest-management|5": { "acc": 0.7475728155339806, "acc_stderr": 0.04301250399690878, "acc_norm": 0.7475728155339806, "acc_norm_stderr": 0.04301250399690878 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8589743589743589, "acc_stderr": 0.022801382534597552, "acc_norm": 0.8589743589743589, "acc_norm_stderr": 0.022801382534597552 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, "acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.7739463601532567, "acc_stderr": 0.014957458504335842, "acc_norm": 0.7739463601532567, "acc_norm_stderr": 0.014957458504335842 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6676300578034682, "acc_stderr": 0.025361168749688225, "acc_norm": 0.6676300578034682, "acc_norm_stderr": 0.025361168749688225 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.34972067039106147, "acc_stderr": 0.01594930879023364, "acc_norm": 0.34972067039106147, "acc_norm_stderr": 0.01594930879023364 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.6797385620915033, "acc_stderr": 0.02671611838015685, "acc_norm": 0.6797385620915033, "acc_norm_stderr": 0.02671611838015685 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6752411575562701, "acc_stderr": 0.026596782287697043, "acc_norm": 0.6752411575562701, "acc_norm_stderr": 0.026596782287697043 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.6759259259259259, "acc_stderr": 0.02604176620271716, "acc_norm": 0.6759259259259259, "acc_norm_stderr": 0.02604176620271716 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.450354609929078, "acc_stderr": 
0.029680105565029036, "acc_norm": 0.450354609929078, "acc_norm_stderr": 0.029680105565029036 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.42698826597131684, "acc_stderr": 0.012633353557534427, "acc_norm": 0.42698826597131684, "acc_norm_stderr": 0.012633353557534427 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5992647058823529, "acc_stderr": 0.029768263528933105, "acc_norm": 0.5992647058823529, "acc_norm_stderr": 0.029768263528933105 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6111111111111112, "acc_stderr": 0.019722058939618068, "acc_norm": 0.6111111111111112, "acc_norm_stderr": 0.019722058939618068 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7346938775510204, "acc_stderr": 0.0282638899437846, "acc_norm": 0.7346938775510204, "acc_norm_stderr": 0.0282638899437846 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7512437810945274, "acc_stderr": 0.030567675938916714, "acc_norm": 0.7512437810945274, "acc_norm_stderr": 0.030567675938916714 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.81, "acc_stderr": 0.039427724440366255, "acc_norm": 0.81, "acc_norm_stderr": 0.039427724440366255 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835816, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835816 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.5226438188494492, "mc1_stderr": 0.01748554225848964, "mc2": 0.6766513448639357, "mc2_stderr": 0.015264009667659464 }, "harness|winogrande|5": { "acc": 0.7679558011049724, "acc_stderr": 0.011864149691827936 }, "harness|gsm8k|5": { "acc": 0.3957543593631539, "acc_stderr": 0.013469823701048815 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
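The loading snippet above targets a single per-task detail configuration; the aggregated scores can be pulled the same way. Below is a minimal sketch assuming only what the card states — the config name "results" and the "latest" split alias; the column layout of the results table is not documented here, so the code inspects the schema instead of hard-coding field names.

```python
from datasets import load_dataset

# Aggregated metrics live in the "results" configuration, and "latest"
# always resolves to the most recent run (per the card above).
results = load_dataset(
    "open-llm-leaderboard/details_BarraHome__Lucie-7b-3e-5",
    "results",
    split="latest",
)

# The schema of this table is not documented in the card, so discover
# it before relying on any field names.
print(results.column_names)
print(results[0])
```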
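As a rough cross-check of the numbers above, the headline metrics can be combined into a single score. The six-way simple mean below is an assumption about how the leaderboard aggregates its benchmarks, and using the "all" accuracy as the MMLU score is only an approximation (that field averages over every task in the run), so treat the result as indicative.

```python
# Hedged sketch: approximate a leaderboard-style average from the
# headline metrics in the JSON above. The six-way simple mean and the
# per-benchmark metric choices are assumptions, not stated in this card.
scores = {
    "arc_challenge": 0.6220136518771331,  # acc_norm, 25-shot
    "hellaswag": 0.8481378211511651,      # acc_norm, 10-shot
    "mmlu_approx": 0.6032184784518743,    # "all" acc, used as an MMLU proxy
    "truthfulqa": 0.6766513448639357,     # mc2, 0-shot
    "winogrande": 0.7679558011049724,     # acc, 5-shot
    "gsm8k": 0.3957543593631539,          # acc, 5-shot
}
print(f"approximate average: {sum(scores.values()) / len(scores):.4f}")
```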
open-llm-leaderboard/details_BarraHome__Lucie-7b-3e-5
[ "region:us" ]
2024-02-13T01:25:12+00:00
{"pretty_name": "Evaluation run of BarraHome/Lucie-7b-3e-5", "dataset_summary": "Dataset automatically created during the evaluation run of model [BarraHome/Lucie-7b-3e-5](https://huggingface.co/BarraHome/Lucie-7b-3e-5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BarraHome__Lucie-7b-3e-5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T01:22:54.103083](https://huggingface.co/datasets/open-llm-leaderboard/details_BarraHome__Lucie-7b-3e-5/blob/main/results_2024-02-13T01-22-54.103083.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032184784518743,\n \"acc_stderr\": 0.03333730204729809,\n \"acc_norm\": 0.607891645213564,\n \"acc_norm_stderr\": 0.03401402537730786,\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464392,\n \"acc_norm\": 0.6220136518771331,\n \"acc_norm_stderr\": 0.0141696645203031\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6612228639713205,\n \"acc_stderr\": 0.004723266971563391,\n \"acc_norm\": 0.8481378211511651,\n \"acc_norm_stderr\": 0.0035815378475817935\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572277,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.0251891498947642,\n 
\"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.0251891498947642\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.040393149787245605,\n \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.040393149787245605\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n \"acc_stderr\": 0.014957458504335842,\n \"acc_norm\": 0.7739463601532567,\n \"acc_norm_stderr\": 
0.014957458504335842\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688225,\n \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688225\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n \"acc_stderr\": 0.01594930879023364,\n \"acc_norm\": 0.34972067039106147,\n \"acc_norm_stderr\": 0.01594930879023364\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n \"acc_stderr\": 0.012633353557534427,\n \"acc_norm\": 0.42698826597131684,\n \"acc_norm_stderr\": 0.012633353557534427\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5992647058823529,\n \"acc_stderr\": 0.029768263528933105,\n \"acc_norm\": 0.5992647058823529,\n \"acc_norm_stderr\": 0.029768263528933105\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.019722058939618068,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.019722058939618068\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366255,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366255\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5226438188494492,\n \"mc1_stderr\": 0.01748554225848964,\n \"mc2\": 0.6766513448639357,\n \"mc2_stderr\": 0.015264009667659464\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827936\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3957543593631539,\n \"acc_stderr\": 0.013469823701048815\n }\n}\n```", "repo_url": "https://huggingface.co/BarraHome/Lucie-7b-3e-5", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|arc:challenge|25_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|gsm8k|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hellaswag|10_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-22-54.103083.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-22-54.103083.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-22-54.103083.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T01-22-54.103083.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-22-54.103083.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-22-54.103083.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["**/details_harness|winogrande|5_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T01-22-54.103083.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T01_22_54.103083", "path": ["results_2024-02-13T01-22-54.103083.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T01-22-54.103083.parquet"]}]}]}
2024-02-13T01:25:36+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of BarraHome/Lucie-7b-3e-5 Dataset automatically created during the evaluation run of model BarraHome/Lucie-7b-3e-5 on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-13T01:22:54.103083 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
[ "# Dataset Card for Evaluation run of BarraHome/Lucie-7b-3e-5\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Lucie-7b-3e-5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T01:22:54.103083(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of BarraHome/Lucie-7b-3e-5\n\n\n\nDataset automatically created during the evaluation run of model BarraHome/Lucie-7b-3e-5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T01:22:54.103083(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
f33548270961273ad29fb425a07f5abbb908ae4f
# Dataset Card for Evaluation run of LHC88/DPOpenHermes-7B-v2-PerfLaser <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [LHC88/DPOpenHermes-7B-v2-PerfLaser](https://huggingface.co/LHC88/DPOpenHermes-7B-v2-PerfLaser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LHC88__DPOpenHermes-7B-v2-PerfLaser", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-13T01:23:03.882958](https://huggingface.co/datasets/open-llm-leaderboard/details_LHC88__DPOpenHermes-7B-v2-PerfLaser/blob/main/results_2024-02-13T01-23-03.882958.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.630320332246712, "acc_stderr": 0.03233621970243457, "acc_norm": 0.6320323733340334, "acc_norm_stderr": 0.032983598662158733, "mc1": 0.40514075887392903, "mc1_stderr": 0.017185611727753368, "mc2": 0.5906705349831306, "mc2_stderr": 0.015428183371279078 }, "harness|arc:challenge|25": { "acc": 0.6279863481228669, "acc_stderr": 0.014124597881844461, "acc_norm": 0.6638225255972696, "acc_norm_stderr": 0.013804855026205763 }, "harness|hellaswag|10": { "acc": 0.6547500497908784, "acc_stderr": 0.004744780201276634, "acc_norm": 0.8458474407488548, "acc_norm_stderr": 0.0036035695286784114 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04292596718256981, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04292596718256981 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6842105263157895, "acc_stderr": 0.0378272898086547, "acc_norm": 0.6842105263157895, "acc_norm_stderr": 0.0378272898086547 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.57, "acc_stderr": 0.049756985195624284, "acc_norm": 0.57, "acc_norm_stderr": 0.049756985195624284 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6867924528301886, "acc_stderr": 0.028544793319055326, "acc_norm": 0.6867924528301886, "acc_norm_stderr": 0.028544793319055326 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7291666666666666, "acc_stderr": 0.037161774375660185, "acc_norm": 0.7291666666666666, "acc_norm_stderr": 0.037161774375660185 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.46, "acc_stderr": 0.05009082659620332, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620332 }, 
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6011560693641619, "acc_stderr": 0.037336266553835096, "acc_norm": 0.6011560693641619, "acc_norm_stderr": 0.037336266553835096 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.35294117647058826, "acc_stderr": 0.047551296160629475, "acc_norm": 0.35294117647058826, "acc_norm_stderr": 0.047551296160629475 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.75, "acc_stderr": 0.04351941398892446, "acc_norm": 0.75, "acc_norm_stderr": 0.04351941398892446 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5446808510638298, "acc_stderr": 0.03255525359340354, "acc_norm": 0.5446808510638298, "acc_norm_stderr": 0.03255525359340354 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.47368421052631576, "acc_stderr": 0.046970851366478626, "acc_norm": 0.47368421052631576, "acc_norm_stderr": 0.046970851366478626 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.41798941798941797, "acc_stderr": 0.02540255550326091, "acc_norm": 0.41798941798941797, "acc_norm_stderr": 0.02540255550326091 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.33, "acc_stderr": 0.047258156262526045, "acc_norm": 0.33, "acc_norm_stderr": 0.047258156262526045 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7645161290322581, "acc_stderr": 0.024137632429337717, "acc_norm": 0.7645161290322581, "acc_norm_stderr": 0.024137632429337717 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.67, "acc_stderr": 0.04725815626252607, "acc_norm": 0.67, "acc_norm_stderr": 0.04725815626252607 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7878787878787878, "acc_stderr": 0.031922715695483, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.031922715695483 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7878787878787878, "acc_stderr": 0.029126522834586818, "acc_norm": 0.7878787878787878, "acc_norm_stderr": 0.029126522834586818 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8704663212435233, "acc_stderr": 0.02423353229775873, "acc_norm": 0.8704663212435233, "acc_norm_stderr": 0.02423353229775873 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6307692307692307, "acc_stderr": 0.024468615241478926, "acc_norm": 0.6307692307692307, "acc_norm_stderr": 0.024468615241478926 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.28888888888888886, "acc_stderr": 0.027634907264178544, "acc_norm": 0.28888888888888886, "acc_norm_stderr": 0.027634907264178544 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6680672268907563, "acc_stderr": 0.03058869701378364, "acc_norm": 0.6680672268907563, "acc_norm_stderr": 0.03058869701378364 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 
0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8220183486238533, "acc_stderr": 0.016399436366612893, "acc_norm": 0.8220183486238533, "acc_norm_stderr": 0.016399436366612893 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5509259259259259, "acc_stderr": 0.03392238405321617, "acc_norm": 0.5509259259259259, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8235294117647058, "acc_stderr": 0.02675640153807897, "acc_norm": 0.8235294117647058, "acc_norm_stderr": 0.02675640153807897 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.810126582278481, "acc_stderr": 0.02553010046023349, "acc_norm": 0.810126582278481, "acc_norm_stderr": 0.02553010046023349 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.672645739910314, "acc_stderr": 0.03149384670994131, "acc_norm": 0.672645739910314, "acc_norm_stderr": 0.03149384670994131 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7709923664122137, "acc_stderr": 0.036853466317118506, "acc_norm": 0.7709923664122137, "acc_norm_stderr": 0.036853466317118506 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7314814814814815, "acc_stderr": 0.042844679680521934, "acc_norm": 0.7314814814814815, "acc_norm_stderr": 0.042844679680521934 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7607361963190185, "acc_stderr": 0.0335195387952127, "acc_norm": 0.7607361963190185, "acc_norm_stderr": 0.0335195387952127 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.036756688322331886, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.036756688322331886 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8717948717948718, "acc_stderr": 0.021901905115073332, "acc_norm": 0.8717948717948718, "acc_norm_stderr": 0.021901905115073332 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.7, "acc_stderr": 0.04605661864718381, "acc_norm": 0.7, "acc_norm_stderr": 0.04605661864718381 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8071519795657727, "acc_stderr": 0.014108533515757433, "acc_norm": 0.8071519795657727, "acc_norm_stderr": 0.014108533515757433 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.6907514450867052, "acc_stderr": 0.02488314057007176, "acc_norm": 0.6907514450867052, "acc_norm_stderr": 0.02488314057007176 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.36089385474860336, "acc_stderr": 0.016062290671110452, "acc_norm": 0.36089385474860336, "acc_norm_stderr": 0.016062290671110452 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7418300653594772, "acc_stderr": 0.02505850331695814, "acc_norm": 0.7418300653594772, "acc_norm_stderr": 0.02505850331695814 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.6977491961414791, "acc_stderr": 0.026082700695399672, "acc_norm": 0.6977491961414791, "acc_norm_stderr": 0.026082700695399672 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7222222222222222, "acc_stderr": 0.02492200116888633, "acc_norm": 0.7222222222222222, "acc_norm_stderr": 0.02492200116888633 }, "harness|hendrycksTest-professional_accounting|5": { 
"acc": 0.46808510638297873, "acc_stderr": 0.029766675075873866, "acc_norm": 0.46808510638297873, "acc_norm_stderr": 0.029766675075873866 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4602346805736636, "acc_stderr": 0.012729785386598559, "acc_norm": 0.4602346805736636, "acc_norm_stderr": 0.012729785386598559 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.0279715413701706, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.0279715413701706 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6683006535947712, "acc_stderr": 0.01904748523936038, "acc_norm": 0.6683006535947712, "acc_norm_stderr": 0.01904748523936038 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, "acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7183673469387755, "acc_stderr": 0.02879518557429129, "acc_norm": 0.7183673469387755, "acc_norm_stderr": 0.02879518557429129 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8159203980099502, "acc_stderr": 0.027403859410786848, "acc_norm": 0.8159203980099502, "acc_norm_stderr": 0.027403859410786848 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.0368452949177471, "acc_norm": 0.84, "acc_norm_stderr": 0.0368452949177471 }, "harness|hendrycksTest-virology|5": { "acc": 0.5240963855421686, "acc_stderr": 0.03887971849597264, "acc_norm": 0.5240963855421686, "acc_norm_stderr": 0.03887971849597264 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8421052631578947, "acc_stderr": 0.02796678585916089, "acc_norm": 0.8421052631578947, "acc_norm_stderr": 0.02796678585916089 }, "harness|truthfulqa:mc|0": { "mc1": 0.40514075887392903, "mc1_stderr": 0.017185611727753368, "mc2": 0.5906705349831306, "mc2_stderr": 0.015428183371279078 }, "harness|winogrande|5": { "acc": 0.7861089187056038, "acc_stderr": 0.011524466954090255 }, "harness|gsm8k|5": { "acc": 0.6004548900682335, "acc_stderr": 0.013491660298815988 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
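As with the per-task details above, the aggregated scores can be loaded directly from the "results" configuration of this same repository, whose "latest" split always points at the newest run. Below is a minimal sketch; the exact column layout of the results parquet is not specified by this card, so inspect the schema before relying on particular fields:

```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split tracks the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_LHC88__DPOpenHermes-7B-v2-PerfLaser",
    "results",
    split="latest",
)

# Inspect the schema before relying on particular fields.
print(results.column_names)
print(results[0])
```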
open-llm-leaderboard/details_LHC88__DPOpenHermes-7B-v2-PerfLaser
[ "region:us" ]
2024-02-13T01:25:25+00:00
{"pretty_name": "Evaluation run of LHC88/DPOpenHermes-7B-v2-PerfLaser", "dataset_summary": "Dataset automatically created during the evaluation run of model [LHC88/DPOpenHermes-7B-v2-PerfLaser](https://huggingface.co/LHC88/DPOpenHermes-7B-v2-PerfLaser) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LHC88__DPOpenHermes-7B-v2-PerfLaser\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T01:23:03.882958](https://huggingface.co/datasets/open-llm-leaderboard/details_LHC88__DPOpenHermes-7B-v2-PerfLaser/blob/main/results_2024-02-13T01-23-03.882958.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.630320332246712,\n \"acc_stderr\": 0.03233621970243457,\n \"acc_norm\": 0.6320323733340334,\n \"acc_norm_stderr\": 0.032983598662158733,\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5906705349831306,\n \"mc2_stderr\": 0.015428183371279078\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6279863481228669,\n \"acc_stderr\": 0.014124597881844461,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205763\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6547500497908784,\n \"acc_stderr\": 0.004744780201276634,\n \"acc_norm\": 0.8458474407488548,\n \"acc_norm_stderr\": 0.0036035695286784114\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.037161774375660185,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.037161774375660185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 
0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340354,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340354\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.024137632429337717,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.024137632429337717\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612893,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n 
\"acc_stderr\": 0.014108533515757433,\n \"acc_norm\": 0.8071519795657727,\n \"acc_norm_stderr\": 0.014108533515757433\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36089385474860336,\n \"acc_stderr\": 0.016062290671110452,\n \"acc_norm\": 0.36089385474860336,\n \"acc_norm_stderr\": 0.016062290671110452\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399672,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399672\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.012729785386598559,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.012729785386598559\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40514075887392903,\n \"mc1_stderr\": 0.017185611727753368,\n \"mc2\": 0.5906705349831306,\n \"mc2_stderr\": 0.015428183371279078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090255\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6004548900682335,\n \"acc_stderr\": 0.013491660298815988\n }\n}\n```", "repo_url": 
"https://huggingface.co/LHC88/DPOpenHermes-7B-v2-PerfLaser", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|arc:challenge|25_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|gsm8k|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hellaswag|10_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-23-03.882958.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-23-03.882958.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-23-03.882958.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T01-23-03.882958.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-23-03.882958.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-23-03.882958.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["**/details_harness|winogrande|5_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T01-23-03.882958.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T01_23_03.882958", "path": ["results_2024-02-13T01-23-03.882958.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T01-23-03.882958.parquet"]}]}]}
2024-02-13T01:25:48+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of LHC88/DPOpenHermes-7B-v2-PerfLaser Dataset automatically created during the evaluation run of model LHC88/DPOpenHermes-7B-v2-PerfLaser on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-13T01:23:03.882958 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
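The loading snippet was stripped from this processed copy of the card, so here is a minimal sketch of the loading step the text refers to. The repository name is an assumption based on the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming and is not confirmed by this copy of the card:

```python
from datasets import load_dataset

# Minimal sketch: load the Winogrande details for this evaluation run.
# The repository name assumes the leaderboard's usual naming pattern
# (open-llm-leaderboard/details_<org>__<model>); verify before relying on it.
data = load_dataset(
    "open-llm-leaderboard/details_LHC88__DPOpenHermes-7B-v2-PerfLaser",
    "harness_winogrande_5",
    split="train",
)
```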
[ "# Dataset Card for Evaluation run of LHC88/DPOpenHermes-7B-v2-PerfLaser\n\n\n\nDataset automatically created during the evaluation run of model LHC88/DPOpenHermes-7B-v2-PerfLaser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T01:23:03.882958(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of LHC88/DPOpenHermes-7B-v2-PerfLaser\n\n\n\nDataset automatically created during the evaluation run of model LHC88/DPOpenHermes-7B-v2-PerfLaser on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T01:23:03.882958(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
2bc41c351f143ed660cc659006b8c82063b49bb0
# Dataset Card for Evaluation run of Gille/StrangeMerges_22-7B-slerp <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_22-7B-slerp](https://huggingface.co/Gille/StrangeMerges_22-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-13T01:26:13.113566](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp/blob/main/results_2024-02-13T01-26-13.113566.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6543671954489716, "acc_stderr": 0.03205804055740569, "acc_norm": 0.6535998321857869, "acc_norm_stderr": 0.03273084993490379, "mc1": 0.6009791921664627, "mc1_stderr": 0.017142825728496763, "mc2": 0.7490450940434222, "mc2_stderr": 0.014305107509742374 }, "harness|arc:challenge|25": { "acc": 0.7167235494880546, "acc_stderr": 0.013167478735134575, "acc_norm": 0.7372013651877133, "acc_norm_stderr": 0.012862523175351335 }, "harness|hellaswag|10": { "acc": 0.719577773351922, "acc_stderr": 0.004482874732237349, "acc_norm": 0.8902609042023502, "acc_norm_stderr": 0.003119254828848947 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6592592592592592, "acc_stderr": 0.040943762699967926, "acc_norm": 0.6592592592592592, "acc_norm_stderr": 0.040943762699967926 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.64, "acc_stderr": 0.04824181513244218, "acc_norm": 0.64, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7018867924528301, "acc_stderr": 0.02815283794249386, "acc_norm": 0.7018867924528301, "acc_norm_stderr": 0.02815283794249386 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.51, "acc_stderr": 0.05024183937956912, "acc_norm": 0.51, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 0.049888765156985884 },
"harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5617021276595745, "acc_stderr": 0.03243618636108101, "acc_norm": 0.5617021276595745, "acc_norm_stderr": 0.03243618636108101 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4649122807017544, "acc_stderr": 0.046920083813689104, "acc_norm": 0.4649122807017544, "acc_norm_stderr": 0.046920083813689104 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4021164021164021, "acc_stderr": 0.025253032554997692, "acc_norm": 0.4021164021164021, "acc_norm_stderr": 0.025253032554997692 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7935483870967742, "acc_stderr": 0.023025899617188712, "acc_norm": 0.7935483870967742, "acc_norm_stderr": 0.023025899617188712 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5073891625615764, "acc_stderr": 0.035176035403610105, "acc_norm": 0.5073891625615764, "acc_norm_stderr": 0.035176035403610105 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7575757575757576, "acc_stderr": 0.03346409881055953, "acc_norm": 0.7575757575757576, "acc_norm_stderr": 0.03346409881055953 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218967, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218967 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9067357512953368, "acc_stderr": 0.02098685459328973, "acc_norm": 0.9067357512953368, "acc_norm_stderr": 0.02098685459328973 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.02874204090394848, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.02874204090394848 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6764705882352942, "acc_stderr": 0.030388353551886793, "acc_norm": 0.6764705882352942, "acc_norm_stderr": 0.030388353551886793 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.39072847682119205, "acc_stderr": 
0.03983798306659807, "acc_norm": 0.39072847682119205, "acc_norm_stderr": 0.03983798306659807 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8403669724770643, "acc_stderr": 0.015703498348461763, "acc_norm": 0.8403669724770643, "acc_norm_stderr": 0.015703498348461763 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5, "acc_stderr": 0.034099716973523674, "acc_norm": 0.5, "acc_norm_stderr": 0.034099716973523674 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.025524722324553346, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.025524722324553346 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.0263616516683891, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.0263616516683891 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.034981493854624714, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.034981493854624714 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7933884297520661, "acc_stderr": 0.03695980128098824, "acc_norm": 0.7933884297520661, "acc_norm_stderr": 0.03695980128098824 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7870370370370371, "acc_stderr": 0.0395783547198098, "acc_norm": 0.7870370370370371, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.42857142857142855, "acc_stderr": 0.04697113923010212, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.04697113923010212 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8803418803418803, "acc_stderr": 0.021262719400406964, "acc_norm": 0.8803418803418803, "acc_norm_stderr": 0.021262719400406964 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8263090676883781, "acc_stderr": 0.01354741565866226, "acc_norm": 0.8263090676883781, "acc_norm_stderr": 0.01354741565866226 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7427745664739884, "acc_stderr": 0.023532925431044287, "acc_norm": 0.7427745664739884, "acc_norm_stderr": 0.023532925431044287 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4491620111731844, "acc_stderr": 0.01663583834163192, "acc_norm": 0.4491620111731844, "acc_norm_stderr": 0.01663583834163192 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729484, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729484 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.707395498392283, "acc_stderr": 0.02583989833487798, "acc_norm": 0.707395498392283, "acc_norm_stderr": 0.02583989833487798 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7469135802469136, "acc_stderr": 0.024191808600712992, "acc_norm": 0.7469135802469136, "acc_norm_stderr": 0.024191808600712992 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.48226950354609927, "acc_stderr": 0.02980873964223777, "acc_norm": 0.48226950354609927, "acc_norm_stderr": 0.02980873964223777 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.47327249022164275, "acc_stderr": 0.012751977967676013, "acc_norm": 0.47327249022164275, "acc_norm_stderr": 0.012751977967676013 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462923, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462923 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.673202614379085, "acc_stderr": 0.018975427920507208, "acc_norm": 0.673202614379085, "acc_norm_stderr": 0.018975427920507208 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.044612721759105085, "acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.044612721759105085 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.02812342933514278, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.02812342933514278 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.025870646766169136, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.025870646766169136 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.83, "acc_stderr": 0.0377525168068637, "acc_norm": 0.83, "acc_norm_stderr": 0.0377525168068637 }, "harness|hendrycksTest-virology|5": { "acc": 0.5662650602409639, "acc_stderr": 0.03858158940685515, "acc_norm": 0.5662650602409639, "acc_norm_stderr": 0.03858158940685515 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.6009791921664627, "mc1_stderr": 0.017142825728496763, "mc2": 0.7490450940434222, "mc2_stderr": 0.014305107509742374 }, "harness|winogrande|5": { "acc": 0.8476716653512234, "acc_stderr": 0.010099208246065604 }, "harness|gsm8k|5": { "acc": 0.6974981046247157, "acc_stderr": 0.012652544133186141 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
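The card's example above loads a single task configuration. As a complementary minimal sketch, the aggregated metrics can be pulled from the "results" configuration, whose "latest" split (defined in the configs metadata below) always points at the most recent run:

```python
from datasets import load_dataset

# Minimal sketch: the "results" configuration stores the aggregated metrics
# for the run, and its "latest" split points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp",
    "results",
    split="latest",
)
```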
open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp
[ "region:us" ]
2024-02-13T01:28:33+00:00
{"pretty_name": "Evaluation run of Gille/StrangeMerges_22-7B-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_22-7B-slerp](https://huggingface.co/Gille/StrangeMerges_22-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T01:26:13.113566](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp/blob/main/results_2024-02-13T01-26-13.113566.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543671954489716,\n \"acc_stderr\": 0.03205804055740569,\n \"acc_norm\": 0.6535998321857869,\n \"acc_norm_stderr\": 0.03273084993490379,\n \"mc1\": 0.6009791921664627,\n \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.7490450940434222,\n \"mc2_stderr\": 0.014305107509742374\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351335\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.719577773351922,\n \"acc_stderr\": 0.004482874732237349,\n \"acc_norm\": 0.8902609042023502,\n \"acc_norm_stderr\": 0.003119254828848947\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659807,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659807\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 
0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n \"acc_stderr\": 0.01663583834163192,\n \"acc_norm\": 0.4491620111731844,\n \"acc_norm_stderr\": 0.01663583834163192\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n \"acc_stderr\": 0.012751977967676013,\n \"acc_norm\": 0.47327249022164275,\n \"acc_norm_stderr\": 0.012751977967676013\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6009791921664627,\n \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.7490450940434222,\n \"mc2_stderr\": 0.014305107509742374\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065604\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \"acc_stderr\": 0.012652544133186141\n }\n}\n```", "repo_url": 
"https://huggingface.co/Gille/StrangeMerges_22-7B-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|arc:challenge|25_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|gsm8k|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hellaswag|10_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T01-26-13.113566.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["**/details_harness|winogrande|5_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T01-26-13.113566.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_13T01_26_13.113566", "path": ["results_2024-02-13T01-26-13.113566.parquet"]}, {"split": "latest", "path": 
["results_2024-02-13T01-26-13.113566.parquet"]}]}]}
2024-02-13T01:28:59+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of Gille/StrangeMerges_22-7B-slerp Dataset automatically created during the evaluation run of model Gille/StrangeMerges_22-7B-slerp on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following (see the Python sketch after this card): ## Latest results These are the latest results from run 2024-02-13T01:26:13.113566 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
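The loading snippet that originally followed "do the following:" was stripped when this card was flattened; a minimal reconstruction, following the template used by the other evaluation cards in this dump (the repo id is inferred from the model name and is an assumption):

```python
from datasets import load_dataset

# Reconstruction of the stripped snippet; the repo id follows the
# "details_<org>__<model>" convention and is inferred, not confirmed.
data = load_dataset(
    "open-llm-leaderboard/details_Gille__StrangeMerges_22-7B-slerp",
    "harness_winogrande_5",
    split="train",
)
```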
[ "# Dataset Card for Evaluation run of Gille/StrangeMerges_22-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_22-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T01:26:13.113566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of Gille/StrangeMerges_22-7B-slerp\n\n\n\nDataset automatically created during the evaluation run of model Gille/StrangeMerges_22-7B-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T01:26:13.113566(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
6d417a32a387f8104359ffb50f9110f299e1aec4
# Dataset Card for "BIOGRID" Jan 24 version
lhallee/BIOGRID
[ "region:us" ]
2024-02-13T01:42:15+00:00
{"configs": [{"config_name": "default", "data_files": [{"split": "MV", "path": "data/MV-*"}, {"split": "EVERY", "path": "data/ALL-*"}]}], "dataset_info": {"features": [{"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "SeqA", "dtype": "string"}, {"name": "SeqB", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "MV", "num_bytes": 643086797, "num_examples": 463460}, {"name": "EVERY", "num_bytes": 3165529028, "num_examples": 2552044}], "download_size": 1585982882, "dataset_size": 3808615825}}
2024-02-13T02:55:41+00:00
[]
[]
TAGS #region-us
# Dataset Card for "BIOGRID" Jan 24 version
[ "# Dataset Card for \"BIOGRID\"\n\nJan 24 version" ]
[ "TAGS\n#region-us \n", "# Dataset Card for \"BIOGRID\"\n\nJan 24 version" ]
66a0dbcc565cf4c42bb142091b47b668ed184429
---
license: apache-2.0
task_categories:
- translation
tags:
- biology
- chemistry
- AI
size_categories:
- 10K<n<100K
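The card stops at this front matter, so nothing beyond the dataset id is documented; a minimal, hedged loading sketch (assuming the default config resolves without extra arguments):

```python
from datasets import load_dataset

# The card documents no splits or features, so this simply loads the
# default config of the dataset named in the record below.
reactions = load_dataset("dzjxzyd/rhea_uniprot_reaction_small")
print(reactions)
```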
dzjxzyd/rhea_uniprot_reaction_small
[ "region:us" ]
2024-02-13T01:54:11+00:00
{}
2024-02-13T16:12:36+00:00
[]
[]
TAGS #region-us
--- license: apache-2.0 task_categories: - translation tags: - biology - chemistry - AI size_categories: - 10K<n<100K
[]
[ "TAGS\n#region-us \n" ]
084ed30a8fc708b9d69c06f41e6df03bb3ab6985
# Dataset Card for Evaluation run of jsfs11/MoEv4Config-TestWeightedTIES-7b <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [jsfs11/MoEv4Config-TestWeightedTIES-7b](https://huggingface.co/jsfs11/MoEv4Config-TestWeightedTIES-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-13T02:02:25.718640](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b/blob/main/results_2024-02-13T02-02-25.718640.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6563638437086499, "acc_stderr": 0.032002661093451415, "acc_norm": 0.6557415111591477, "acc_norm_stderr": 0.032674509311910384, "mc1": 0.5422276621787026, "mc1_stderr": 0.017440965712482125, "mc2": 0.708709362408976, "mc2_stderr": 0.014616149007167033 }, "harness|arc:challenge|25": { "acc": 0.6860068259385665, "acc_stderr": 0.013562691224726295, "acc_norm": 0.7158703071672355, "acc_norm_stderr": 0.013179442447653884 }, "harness|hellaswag|10": { "acc": 0.6951802429794861, "acc_stderr": 0.004593902601979337, "acc_norm": 0.8818960366460864, "acc_norm_stderr": 0.0032207161266850255 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6518518518518519, "acc_stderr": 0.041153246103369526, "acc_norm": 0.6518518518518519, "acc_norm_stderr": 0.041153246103369526 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.6981132075471698, "acc_stderr": 0.02825420034443866, "acc_norm": 0.6981132075471698, "acc_norm_stderr": 0.02825420034443866 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7777777777777778, "acc_stderr": 0.03476590104304134, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.03476590104304134 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.54, "acc_stderr": 0.05009082659620333, "acc_norm": 0.54, "acc_norm_stderr":
0.05009082659620333 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816506, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816506 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6, "acc_stderr": 0.03202563076101735, "acc_norm": 0.6, "acc_norm_stderr": 0.03202563076101735 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5517241379310345, "acc_stderr": 0.04144311810878152, "acc_norm": 0.5517241379310345, "acc_norm_stderr": 0.04144311810878152 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.4312169312169312, "acc_stderr": 0.02550648169813821, "acc_norm": 0.4312169312169312, "acc_norm_stderr": 0.02550648169813821 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5, "acc_stderr": 0.04472135954999579, "acc_norm": 0.5, "acc_norm_stderr": 0.04472135954999579 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7774193548387097, "acc_stderr": 0.023664216671642518, "acc_norm": 0.7774193548387097, "acc_norm_stderr": 0.023664216671642518 }, "harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5024630541871922, "acc_stderr": 0.035179450386910616, "acc_norm": 0.5024630541871922, "acc_norm_stderr": 0.035179450386910616 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.69, "acc_stderr": 0.04648231987117316, "acc_norm": 0.69, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8963730569948186, "acc_stderr": 0.02199531196364424, "acc_norm": 0.8963730569948186, "acc_norm_stderr": 0.02199531196364424 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.34444444444444444, "acc_stderr": 0.02897264888484427, "acc_norm": 0.34444444444444444, "acc_norm_stderr": 0.02897264888484427 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.36423841059602646, "acc_stderr": 0.03929111781242742, "acc_norm": 0.36423841059602646, 
"acc_norm_stderr": 0.03929111781242742 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8477064220183487, "acc_stderr": 0.015405084393157074, "acc_norm": 0.8477064220183487, "acc_norm_stderr": 0.015405084393157074 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 0.034086558679777494, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.034086558679777494 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8431372549019608, "acc_stderr": 0.02552472232455334, "acc_norm": 0.8431372549019608, "acc_norm_stderr": 0.02552472232455334 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8059071729957806, "acc_stderr": 0.025744902532290913, "acc_norm": 0.8059071729957806, "acc_norm_stderr": 0.025744902532290913 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6860986547085202, "acc_stderr": 0.031146796482972465, "acc_norm": 0.6860986547085202, "acc_norm_stderr": 0.031146796482972465 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.035477710041594626, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.035477710041594626 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7768595041322314, "acc_stderr": 0.03800754475228733, "acc_norm": 0.7768595041322314, "acc_norm_stderr": 0.03800754475228733 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.0401910747255735, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.0401910747255735 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.039891398595317706, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.039891398595317706 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179326, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179326 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.73, "acc_stderr": 0.044619604333847394, "acc_norm": 0.73, "acc_norm_stderr": 0.044619604333847394 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8199233716475096, "acc_stderr": 0.013740797258579825, "acc_norm": 0.8199233716475096, "acc_norm_stderr": 0.013740797258579825 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7456647398843931, "acc_stderr": 0.023445826276545543, "acc_norm": 0.7456647398843931, "acc_norm_stderr": 0.023445826276545543 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.42681564245810055, "acc_stderr": 0.016542401954631917, "acc_norm": 0.42681564245810055, "acc_norm_stderr": 0.016542401954631917 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7156862745098039, "acc_stderr": 0.02582916327275748, "acc_norm": 0.7156862745098039, "acc_norm_stderr": 0.02582916327275748 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7266881028938906, "acc_stderr": 0.025311765975426122, "acc_norm": 0.7266881028938906, "acc_norm_stderr": 0.025311765975426122 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7283950617283951, "acc_stderr": 0.024748624490537365, "acc_norm": 0.7283950617283951, "acc_norm_stderr": 0.024748624490537365 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 
0.49645390070921985, "acc_stderr": 0.02982674915328092, "acc_norm": 0.49645390070921985, "acc_norm_stderr": 0.02982674915328092 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869649, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869649 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6838235294117647, "acc_stderr": 0.028245687391462927, "acc_norm": 0.6838235294117647, "acc_norm_stderr": 0.028245687391462927 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6813725490196079, "acc_stderr": 0.01885008469646872, "acc_norm": 0.6813725490196079, "acc_norm_stderr": 0.01885008469646872 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6909090909090909, "acc_stderr": 0.044262946482000985, "acc_norm": 0.6909090909090909, "acc_norm_stderr": 0.044262946482000985 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7387755102040816, "acc_stderr": 0.028123429335142783, "acc_norm": 0.7387755102040816, "acc_norm_stderr": 0.028123429335142783 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5542168674698795, "acc_stderr": 0.03869543323472101, "acc_norm": 0.5542168674698795, "acc_norm_stderr": 0.03869543323472101 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8538011695906432, "acc_stderr": 0.027097290118070806, "acc_norm": 0.8538011695906432, "acc_norm_stderr": 0.027097290118070806 }, "harness|truthfulqa:mc|0": { "mc1": 0.5422276621787026, "mc1_stderr": 0.017440965712482125, "mc2": 0.708709362408976, "mc2_stderr": 0.014616149007167033 }, "harness|winogrande|5": { "acc": 0.8382004735595896, "acc_stderr": 0.010350128010292406 }, "harness|gsm8k|5": { "acc": 0.7278241091736164, "acc_stderr": 0.01225971403516455 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). 
--> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. --> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b
[ "region:us" ]
2024-02-13T02:04:45+00:00
{"pretty_name": "Evaluation run of jsfs11/MoEv4Config-TestWeightedTIES-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jsfs11/MoEv4Config-TestWeightedTIES-7b](https://huggingface.co/jsfs11/MoEv4Config-TestWeightedTIES-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-13T02:02:25.718640](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b/blob/main/results_2024-02-13T02-02-25.718640.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563638437086499,\n \"acc_stderr\": 0.032002661093451415,\n \"acc_norm\": 0.6557415111591477,\n \"acc_norm_stderr\": 0.032674509311910384,\n \"mc1\": 0.5422276621787026,\n \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.708709362408976,\n \"mc2_stderr\": 0.014616149007167033\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726295,\n \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653884\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6951802429794861,\n \"acc_stderr\": 0.004593902601979337,\n \"acc_norm\": 0.8818960366460864,\n \"acc_norm_stderr\": 0.0032207161266850255\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n 
\"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290913,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290913\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594626,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594626\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 
0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537365,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537365\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5422276621787026,\n \"mc1_stderr\": 0.017440965712482125,\n \"mc2\": 0.708709362408976,\n \"mc2_stderr\": 0.014616149007167033\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292406\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7278241091736164,\n \"acc_stderr\": 0.01225971403516455\n }\n}\n```", "repo_url": 
"https://huggingface.co/jsfs11/MoEv4Config-TestWeightedTIES-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|arc:challenge|25_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|gsm8k|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hellaswag|10_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_13T02_02_25.718640", "path": ["**/details_harness|winogrande|5_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-13T02-02-25.718640.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_13T02_02_25.718640", "path": ["results_2024-02-13T02-02-25.718640.parquet"]}, {"split": "latest", "path": ["results_2024-02-13T02-02-25.718640.parquet"]}]}]}
2024-02-13T02:05:10+00:00
[]
[]
TAGS #region-us
# Dataset Card for Evaluation run of jsfs11/MoEv4Config-TestWeightedTIES-7b Dataset automatically created during the evaluation run of model jsfs11/MoEv4Config-TestWeightedTIES-7b on the Open LLM Leaderboard. The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard). To load the details from a run, you can for instance do the following: ## Latest results These are the latest results from run 2024-02-13T02:02:25.718640 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval): ## Dataset Details ### Dataset Description - Curated by: - Funded by [optional]: - Shared by [optional]: - Language(s) (NLP): - License: ### Dataset Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Out-of-Scope Use ## Dataset Structure ## Dataset Creation ### Curation Rationale ### Source Data #### Data Collection and Processing #### Who are the source data producers? ### Annotations [optional] #### Annotation process #### Who are the annotators? #### Personal and Sensitive Information ## Bias, Risks, and Limitations ### Recommendations Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Dataset Card Authors [optional] ## Dataset Card Contact
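The card above ends at "you can for instance do the following:" with the loading snippet stripped during processing. A minimal sketch in the style of the other cards in this collection; the repository name is an assumption inferred from the leaderboard's usual `details_<org>__<model>` naming convention, while the `harness_winogrande_5` config name is taken from this record's configs list:

```python
from datasets import load_dataset

# Assumed repo name, following the leaderboard's "details_<org>__<model>" pattern.
data = load_dataset(
    "open-llm-leaderboard/details_jsfs11__MoEv4Config-TestWeightedTIES-7b",
    "harness_winogrande_5",  # config name confirmed in this record's configs list
    split="train",
)
```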
[ "# Dataset Card for Evaluation run of jsfs11/MoEv4Config-TestWeightedTIES-7b\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/MoEv4Config-TestWeightedTIES-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T02:02:25.718640 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
[ "TAGS\n#region-us \n", "# Dataset Card for Evaluation run of jsfs11/MoEv4Config-TestWeightedTIES-7b\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/MoEv4Config-TestWeightedTIES-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:", "## Latest results\n\nThese are the latest results from run 2024-02-13T02:02:25.718640 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):", "## Dataset Details", "### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:", "### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Out-of-Scope Use", "## Dataset Structure", "## Dataset Creation", "### Curation Rationale", "### Source Data", "#### Data Collection and Processing", "#### Who are the source data producers?", "### Annotations [optional]", "#### Annotation process", "#### Who are the annotators?", "#### Personal and Sensitive Information", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Dataset Card Authors [optional]", "## Dataset Card Contact" ]
4e3a48b825f51138fcd88a0eaba075873b610754
July 23 Version
lhallee/HUMAN_PROTEOME
[ "region:us" ]
2024-02-13T02:20:35+00:00
{"dataset_info": {"features": [{"name": "ID", "dtype": "string"}, {"name": "Seq", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 27996808, "num_examples": 81434}], "download_size": 25674706, "dataset_size": 27996808}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
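The metadata above declares two string features (`ID` and `Seq`) and a single `train` split of 81,434 examples. A minimal loading sketch based only on that config (the `Seq` slice is just for display):

```python
from datasets import load_dataset

# Load the single "train" split declared in the dataset_info above.
proteome = load_dataset("lhallee/HUMAN_PROTEOME", split="train")
print(proteome.num_rows)           # 81434 per the metadata
row = proteome[0]
print(row["ID"], row["Seq"][:60])  # truncate the sequence for display
```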
2024-02-13T02:21:04+00:00
[]
[]
TAGS #region-us
July 23 Version
[]
[ "TAGS\n#region-us \n" ]
9a4b11121786674ef595478fe866eb12cfd60ef1
Just some New Zealand court decisions data; it probably needs some more cleaning. The data ranges from 1975 to now, and it's all randomly selected: 1158 cases.
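Given the description, a minimal sketch for pulling the cases; the default config and a `train` split are assumptions, since this record's metadata lists no data-file layout:

```python
from datasets import load_dataset

# "train" split is an assumption -- inspect the returned DatasetDict if it fails.
cases = load_dataset("Princess3/Court_data_1k", split="train")
print(len(cases))  # expected: 1158 randomly selected cases
```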
Princess3/Court_data_1k
[ "language:en", "license:wtfpl", "region:us" ]
2024-02-13T03:27:45+00:00
{"language": ["en"], "license": "wtfpl"}
2024-02-13T07:49:15+00:00
[]
[ "en" ]
TAGS #language-English #license-wtfpl #region-us
Just some New Zealand court decisions data; it probably needs some more cleaning. The data ranges from 1975 to now, and it's all randomly selected: 1158 cases.
[]
[ "TAGS\n#language-English #license-wtfpl #region-us \n" ]